I need to write text files larger than 1GB! My first thought was to do this with BCP, but fetching the data from the stored proc was taking too long (more than an hour), so BCP did not seem to be the right option: we have less flexibility in terms of handling errors, knowing how far the write has progressed, and so on. Most importantly, getting such a huge amount of data did not seem to be a one-time job! So I am now thinking about writing a CLR procedure which fetches a smaller amount of data at a time and writes it to the file. Something like: open connection, get data, close connection, write to file, and continue in cycles of this until the whole file is written. Does this make sense considering the amount of data to be fetched?
Any suggestions or comments on this would be very helpful!
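For what it's worth, the same chunking idea can be sketched without CLR at all, by driving bcp queryout over key ranges from T-SQL. This is only a sketch under assumptions: the table dbo.MyBigTable, its integer key column Id, the chunk size, the server name, and the output path are all hypothetical, and xp_cmdshell must be enabled.

DECLARE @lo int, @hi int, @cmd varchar(1000)
SELECT @lo = MIN(Id), @hi = MAX(Id) FROM dbo.MyBigTable
WHILE @lo <= @hi
BEGIN
    -- export one 100,000-row key range per bcp call; each range goes to its own file part
    SET @cmd = 'bcp "SELECT * FROM MyDb.dbo.MyBigTable WHERE Id BETWEEN '
             + CAST(@lo AS varchar(12)) + ' AND ' + CAST(@lo + 99999 AS varchar(12))
             + '" queryout c:\export\part_' + CAST(@lo AS varchar(12)) + '.txt -c -T -SmyServer'
    EXEC master..xp_cmdshell @cmd
    SET @lo = @lo + 100000
END

If a chunk fails you know exactly which key range to rerun, which addresses the restartability concern, and the parts can be concatenated into one file afterwards.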
I have SQL Server 2005 Express Edition with Advanced Services running on a small web server. It all runs fine, but every now and then the log files grow and grow and eventually use up all 30GB of disk space. As a quick fix, restarting SQL a couple of times clears out the logs and everything is up and running again. Any ideas on how to stop this happening?
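A common cause is a database left in the FULL recovery model with no log backups ever taken, so the log can never truncate. If point-in-time recovery is not needed on this box, a minimal sketch (the database name WebDb and the logical log file name WebDb_log are placeholders; run sp_helpfile in the database to find the real one):

ALTER DATABASE WebDb SET RECOVERY SIMPLE
GO
USE WebDb
GO
DBCC SHRINKFILE (WebDb_log, 500)  -- target size in MB
GO

Under SIMPLE recovery the log truncates on checkpoint instead of growing until the next log backup.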
Hi there - can anyone advise on the following issue? We have recently performed some server-side tracing on a particular SQL instance over a 24-hour period. We are now attempting to load these traces into a database for analysis. Herein lies the problem.
When we are loading the profiler trace files (one at a time) into the database, the transaction log grows at an excessive rate, even though the database is in SIMPLE recovery mode.
We are loading the traces using the command:
INSERT INTO sqlTableToLoad SELECT * FROM ::fn_trace_gettable('MytraceFileName', DEFAULT)
Can anyone advise how we might get around this issue? We're running out of space due to the transaction log.
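One thing that may be worth trying: under the SIMPLE recovery model, SELECT ... INTO is minimally logged, whereas INSERT ... SELECT into an existing table is fully logged. Letting the load create the table could keep the log much smaller; a sketch (the staging table name is made up, everything else is from the command above):

SELECT * INTO sqlTableToLoad_staging FROM ::fn_trace_gettable('MytraceFileName', DEFAULT)

You can then move or rename the staging table once the load completes.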
Hi, I'm relatively new to SQL and ASP.NET, and I'm after some assistance with what I hope is an easily achievable goal. I need to upload some files (.pdf, .doc, .jpg - whatever) to a directory on my webserver. Once this has been done, I'd like to write some data to a SQL database table that will hold information about these files, to be shown in a datagrid (or something similar) along with a link to download them. E.g. I upload car.jpg, which is located at /images/secure/car.jpg and is a photo of a car. When a user logs in, they will be shown a datagrid which pulls information from said SQL database, and in this datagrid will be something like:

Filename   Description      Link
car.jpg    A pic of a car   Download
If anyone is able to help me out with regard to linking to a file from a SQL DB it would be greatly appreciated, and if you require any further information just let me know :) Ben.
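A minimal sketch of the kind of table this usually involves; the table and column names here are made up, and the key point is that you store the file's path (or URL), not the file itself:

CREATE TABLE dbo.UploadedFiles (
    FileId      int IDENTITY(1,1) PRIMARY KEY,
    FileName    varchar(255) NOT NULL,
    Description varchar(500) NULL,
    FilePath    varchar(500) NOT NULL  -- e.g. '/images/secure/car.jpg'
)

The datagrid then binds to a SELECT over this table and renders the Download link as a hyperlink column built from FilePath.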
Hi guys, I have a stored procedure that I will post below this message. What the stored procedure does is read some specific tables in a database and create XML from them. The problem comes in when the data is bigger than a few hundred/thousand characters: the data is truncated and the XML file is not fully made. Now I have been reading some posts by some people and tried using the TEXT/NTEXT/IMAGE types to write the files, but they give me errors which, again, I am adding to this message.

STORED PROC:
****************************
CREATE PROCEDURE usrp_sp_makexmlfile
    @str varchar(50),
    @templatefile varchar(255),
    @ReturnValue as image OUT
AS
SET NOCOUNT ON
DECLARE @strVar as varchar(8000)
/* DECLARE @ReturnValue as varchar(255) */
SET @strVar = 'Select * from ' + @str + ' FOR XML AUTO'
SET @templatefile = 'c:\template.tpl'
EXEC (@strVar)
SELECT @ReturnValue
GO

THIS IS WHAT I GET WHEN I USE THE IMAGE/NTEXT/TEXT TYPE:
*********************************************************
An unhandled exception of type 'System.InvalidOperationException' occurred in system.data.dll
Additional information: Parameter 2: '@ReturnValue' of type: Byte[], the property Size has an invalid size: 0

Also, if I try to give the image/text/ntext parameter an integer Size value like 8000, or something less than that, it tells me that the result has to be discarded because the current process has had some error. Can someone give me a suggestion on how to resolve this, or an example that would illustrate the solution? Thank you all in advance. Pi.
Hi guys, I have been trying to write data from a database to XML files, and although I have been successful in doing that, the file(s) that are made are not of the form that I would want. I am using:

SQLDataSet.WriteXml(strFileName, XmlWriteMode.WriteSchema);

to generate the files, where strFileName is the name of the file. The files, when viewed with an XML editor, are seen as lines of data instead of the regularly generated XML where each row is one below the other. When viewed with WordPad/Notepad I see elements like '/><' between the data, and somehow I think this has something to do with the problem. Can someone help me with this? Any help or suggestion would be greatly appreciated. Thanks.
Hi; before I start, let me say I know this is a silly way to go about things, but it is one of those "my company made me do it" things. My company is in the process of (finishing) migrating from FoxPro to SQL Server. They have a FoxPro program that does a lot of various updates. I would like to write a T-SQL script to replace it. The trick is that the T-SQL script would have to have some procedural programming artifacts that I don't think the language has. I/my company would like the script to generate output messages that the user can see while the script is running. Every time I have run a script in Query Analyzer, I usually don't see the output ("print" statements, etc.) until all of the work is done. Is there a way to run things so I can see the output statements as the work is being done? Can print statements work from within a T-SQL loop? The other thing is that I/my company want the T-SQL script to write output messages to a log file while it is operating (for error checking and other logging). Is this possible with T-SQL? Thanks in advance for the info. Steve
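On the first question: PRINT output is buffered, but RAISERROR with severity 0 and the WITH NOWAIT option is flushed to the client immediately, and it works fine inside a loop. A minimal sketch:

DECLARE @i int
SET @i = 0
WHILE @i < 5
BEGIN
    -- severity 0 plus NOWAIT sends the message right away instead of buffering it
    RAISERROR ('Processed batch %d', 0, 1, @i) WITH NOWAIT
    SET @i = @i + 1
END

For the log file, one common approach is to run the script from the command line with osql -i script.sql -o script.log, rather than trying to write the file from inside T-SQL.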
I have a scenario where all the stored procedures are stored in a folder (one .sql file per sproc). The scripts do not include 'IF EXISTS ... DROP PROCEDURE', so before creating the procedures we have to drop each one manually.
Can anyone help me write a script / SSIS process to loop through each file in the folder and write 'IF EXISTS ... DROP PROCEDURE' with the procedure name into it?
I can create a package that loops through each file with a Foreach Loop task, but I don't know how to write to the file using .NET.
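The header to prepend to each script is standard boilerplate; a sketch for a hypothetical procedure name usp_MyProc (your loop would substitute each file's procedure name):

IF EXISTS (SELECT * FROM sysobjects WHERE id = OBJECT_ID('dbo.usp_MyProc') AND type = 'P')
    DROP PROCEDURE dbo.usp_MyProc
GO

Inside the Foreach Loop, the file rewrite itself can be done in a Script Task with the System.IO classes: read the file, prepend the header, write it back.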
We have a package that is using a Foreach Loop container to access files on a network drive. For some reason I am getting a message that the ForEach enumerator is empty and did not find any files that matched the pattern. For the pattern I left the default *.* for testing purposes. I have specified the file folder as \\remoteserver\fileshare\subfolder and also as \\remoteserver\c$\fileshare\subfolder and have gotten the same message both times. However, when I map a network drive and set the file folder to the network drive, it finds the files. Is this a permissions issue?
After I finish processing the file I want to move it to a new directory. Once this is deployed in production, the package will not be running under a domain account and probably won't have access to the network folder. Is there any way to specify in the connection manager itself that it should use a specific account to access the folder?
Is there a quick way to extract a full dump of 50 tables to 50 corresponding text files?
i.e.
table_a has to be extracted to table_a.txt
table_b has to be extracted to table_b.txt
table_c has to be extracted to table_c.txt etc.
I don't want to have to add each one separately by hand in the DTSX package designer. I can't see any way to do it in a loop (because you have to do the field mapping). I can't seem to get the DTS Wizard to help - it only seems to be able to handle one table-to-text extract at a time. And I've tried editing the DTSX file directly (in XML), but it looks like it's going to be rather complex, even if I only do it to define the connection managers. Feel free to suggest any better way to do this, though the specification has already been agreed, so I'm unlikely to be able to change it. Thanks
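Since the spec is just "every table to a same-named text file", one way to avoid the designer entirely is to generate a bcp command per table from T-SQL. A sketch, assuming xp_cmdshell is available, with the database name, server name, and output path as placeholders:

DECLARE @t sysname, @cmd varchar(1000)
DECLARE c CURSOR FOR
    SELECT name FROM sysobjects WHERE type = 'U'  -- all user tables
OPEN c
FETCH NEXT FROM c INTO @t
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @cmd = 'bcp MyDb.dbo.[' + @t + '] out c:\export\' + @t + '.txt -c -T -SmyServer'
    EXEC master..xp_cmdshell @cmd
    FETCH NEXT FROM c INTO @t
END
CLOSE c
DEALLOCATE c

bcp infers the field layout from the table itself, so no per-table mapping step is needed.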
I want to write a batch file that will do just that. The problem is that bcp (or bulk copy) never works with my code; the command isn't recognized. Maybe I'm going about this the wrong way, but I could use some help. Ideally I'll make the batch file import a text file into a table via bcp or something like it, and use Windows Scheduler to automate it. Please be as specific as you can; I'm very new to SQL Server.
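bcp is a standalone command-line utility, so the batch file calls it directly rather than through SQL; if the command "isn't recognized", the directory containing bcp.exe is probably not on the PATH. A one-line sketch of the batch file, with server, database, table, and file path as placeholders (-T uses Windows authentication, -c imports character data):

bcp MyDb.dbo.MyTable in "C:\data\import.txt" -S myServer -T -c

Schedule that .bat with Windows Task Scheduler and the import runs unattended.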
I just peeked at my DNN setup and I found that I have a transaction log about 98 gigs large, compared to a DNN database that is only about 250 megs. Crazy, huh?
Do you happen to know what I need that transaction log for? Can I just delete it, or will that break my SQL DB? Is there a way I can keep only maybe a week of transactions in it so it doesn't grow so dang large?
One of my production databases is currently 51 MB. The transaction log is well over 5 GB. I have tried truncating and then shrinking the log through the SQL utilities. This does not work! How can I quickly resolve this problem without tampering with the production environment?
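One reason the shrink often appears not to work is that the active portion of the log sits at its end; truncating first, then shrinking by the log's logical file name, usually helps. A sketch with placeholder names (run sp_helpfile in the database to get the real logical name):

USE MyProdDb
GO
BACKUP LOG MyProdDb WITH TRUNCATE_ONLY
GO
DBCC SHRINKFILE (MyProdDb_log, 100)  -- target size in MB
GO

Note that TRUNCATE_ONLY breaks the log backup chain, so take a full backup immediately afterwards.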
I am not a DBA, and I run a personal web site that has gotten pretty large. I have never done anything to maintain my SQL Server, and now my transaction log is 10 GB while my data is only about 300 MB. I am starting to see a memory leak with the SQL service. What should I do? Is it bad to have a huge transaction log? I am not familiar with any of this stuff, so someone please point me in the right direction.
Hi all, I found that my database log file is 26GB while the database file is just about 280MB. We do a full backup every day. However, my SQL Server seems to be running very slowly now; please advise:
1. How can I decrease/truncate my log file?
2. Could the huge size of the log file be a reason my SQL Server is slowing down?
3. Could anyone point me to more information on the transaction log?
Thank you, much appreciated!
Hi guys, it's my first post! It's also about my first time really diving into SQL. We are using SharePoint on site here along with SQL Server 2005; one of our log files is 255 GB and needs to be made smaller very fast!! We are almost out of disk space and the log is growing fast.
I am very new to SQL and don't even know where to go to enter commands, so you'll have to bear with me here. I've read about truncating and shrinking and some other things; I am just worried and don't want to mess anything up. I know this is probably a simple task, but like I said, with the truncate command I was reading about, I don't even know where to type it in!!! If someone could please help, it would be much appreciated. Thanks so much.
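On where to type it: with SQL Server 2005 the commands go into a query window in SQL Server Management Studio (connect to the server, click New Query). A sketch of the usual sequence, with placeholder names; your database name will differ, and sp_helpfile run inside the database shows the log's logical file name:

USE MyDatabase
GO
EXEC sp_helpfile
GO
BACKUP LOG MyDatabase WITH TRUNCATE_ONLY
GO
DBCC SHRINKFILE (MyDatabase_log, 1000)  -- target size in MB
GO

Take a full backup right after, since truncating the log this way invalidates the existing log backup chain.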
I have a huge dataset in MS SQL Server: over 11 million records of machine data, sampled every four seconds. I really don't need samples at this interval; I'd like to run a query that retrieves my data at 30-minute intervals. I know enough SQL and DTS to SELECT the fields I want and to direct the output to a CSV file, but I'm not sure how to manipulate the TimeStamp field in the query to pull data only every 450 records.
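Rather than counting off every 450th row (which drifts if any samples are missed), one approach is to bucket the timestamps into 30-minute intervals and keep the first reading in each bucket. A sketch with hypothetical table and column names dbo.MachineData and TimeStamp:

SELECT d.*
FROM dbo.MachineData d
JOIN (
    -- round each TimeStamp down to its 30-minute bucket, keep the earliest row per bucket
    SELECT MIN(TimeStamp) AS TimeStamp
    FROM dbo.MachineData
    GROUP BY DATEADD(minute, (DATEDIFF(minute, 0, TimeStamp) / 30) * 30, 0)
) s ON d.TimeStamp = s.TimeStamp

The DTS export to CSV can then read from this query instead of the raw table.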
I have a problem and I am sure someone here can help. Last night my DB was working fine, as it has been. This morning I got to the office and now everything has gone to hell in a handbag.
I can no longer connect to my SQL 2005 DB, and I get this error when trying to place an order on our order page:
"There was a problem with the website:An error has occurred while establishing a connection to the server. When connecting to SQL Server 2005, this failure may be caused by the fact that under the default settings SQL Server does not allow remote connections. (provider: TCP Provider, error: 0 - No connection could be made because the target machine actively refused it.) The error has been logged."
I also notice the SQL Agent will not start. Also, the former owner of the company had an eval version of SQL Management Studio that must have just expired (could this be causing it?).
I have a .bak file of 72 GB, but my database size is only 32 GB (I got this value from sp_spaceused). Does anyone know why the .bak file is so big? Is it possible to reduce the size? How could I reduce it?
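One common cause: backing up repeatedly to the same file with the default WITH NOINIT appends a new backup set each time, so the .bak keeps growing even though each individual backup is only the size of the data. You can check for this, and switch to overwriting, along these lines (the path and database name are placeholders):

-- list how many backup sets are stacked in the file
RESTORE HEADERONLY FROM DISK = 'C:\backups\MyDb.bak'
-- overwrite instead of appending from now on
BACKUP DATABASE MyDb TO DISK = 'C:\backups\MyDb.bak' WITH INIT

If the header list shows several sets, that would explain a 72 GB file for a 32 GB database.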
Hello, I have a very big T-SQL script (~24 MB) and I need the SQL Server at my hosting provider to execute it.
Right now, I have a web page that uses Sql.Connection.ExecuteNonQuery() to do it, but the .NET process runs out of memory when I load the file into a C# string.
How would I go about executing this T-SQL script on the server? Is there a command like: EXECUTE SCRIPT "myscript.sql" FROM DISC?
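There is no EXECUTE SCRIPT ... FROM DISC command in T-SQL, but the sqlcmd utility that ships with SQL Server 2005 streams a script from disk without loading it all into memory; a sketch with placeholder connection details:

sqlcmd -S myServer -d myDatabase -U myUser -P myPassword -i myscript.sql

If the host only gives you web access, a fallback is to read the file batch by batch in C# (splitting on the GO separators) and execute each batch separately, so no single string ever holds the whole 24 MB.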
Hi, this is regarding SQL Server 2000 (it was upgraded from SQL 7). Its log file is increasing at a very high rate: say 40 GB, 50 GB, and now 57 GB. The MDF file is around 15 MB. We created a backup and tried to restore it to another system, but it asks for 57 GB of free space. How should we proceed with the file recovery? We have backups, but the restore asks for more space for the log file. How can we retrieve the data? Regards, Pramod
Good evening: We're porting an old app written in ASP.NET 1.1 and SQL 2000 to ASP.NET 2.0 and SQL 2005. In the old app we have a few datagrids that are populated from a dataset pulled from the database. We use a SQL query that we build based on more than 10 different user inputs, the result of which is an enormously complicated SQL string. We'd like to move this processing into a SPROC in the 2005 database. Rather than writing stored procedures to create the SQL SELECT statement, is it possible to pass an entire SELECT statement to a SPROC and have it executed within? We're trying to capitalize on paging in 2005 using ...ROW_NUMBER() OVER (ORDER BY PM ASC)..., and building the string using IF ELSE statements is mind-numbingly complex and tedious. Any suggestions would be great. Thanks, Brad
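Yes, you can hand a complete SELECT to a stored procedure and execute it there with sp_executesql. A minimal sketch (the procedure name is hypothetical):

CREATE PROCEDURE dbo.usp_RunSearch
    @sql nvarchar(max)
AS
SET NOCOUNT ON
-- the caller-built SELECT, including ROW_NUMBER() OVER (...), runs as-is
EXEC sp_executesql @sql
GO

Be aware this is still dynamic SQL, so validate or parameterize the inputs that build the string to guard against SQL injection; sp_executesql also accepts a parameter list if you can keep the statement fixed and vary only the values.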
I need to alter a table (expand a column from varchar(10) to varchar(255)) and the table has 200 million rows. Please suggest the best and fastest method to achieve this. The database is on SQL 7.0.
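Widening a varchar column should be a metadata-only change, since existing rows already fit in the wider definition, so a plain ALTER TABLE should be near-instant even at 200 million rows; ALTER COLUMN is supported from SQL 7.0. A sketch with placeholder names (restate the column's existing NULL/NOT NULL setting explicitly, otherwise the nullability may change):

ALTER TABLE dbo.BigTable ALTER COLUMN MyCol varchar(255) NOT NULL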
I am using SQL Server 7 and have about 5 databases. One of them has a data file of about 10 MB, and most of the others are larger. I do a nightly backup to both a local and a mapped drive. On both, the size of the backup file for this database is more than 500 MB, but the rest appear to be an appropriate size. Does anyone know why this would be happening? The database works fine; it does not get a lot of insert/delete activity, and I run DBCC every weekend. If anyone has any ideas, I would sure like to hear them.
Does anybody know why BCP on v6.5 grabs so much of SQL Server's memory? I have a few table imports where the BCP process will consume over 460 MB of RAM during the imports.
The BCP cmd file is executed via an xp_cmdshell call. The server has 2+ GB of RAM, but the BCP process effectively flushes large amounts of data from the buffer. It takes quite a long time for the cache to recover from this, and afterwards the rest of the nightly processes run much slower, as they end up having to hit the drives to retrieve information that should already be in cache.
If anyone can shed some light on this it would be much appreciated.
I have a huge log file (285 MB) on SQL Server 7. The database itself is about 10 MB. How can I reduce the log file? Is it possible to rebuild it from scratch?
I tried the Truncate Transaction Log but it didn't help.
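On SQL Server 7, the sequence that usually works is to truncate first and then shrink by the log's logical file name (which sp_helpfile shows); the names below are placeholders:

BACKUP LOG MyDb WITH TRUNCATE_ONLY
DBCC SHRINKFILE (MyDb_log, 10)  -- target size in MB

On 7.0 the log shrink can be deferred until log activity wraps the active portion back to the start of the file, so it may not take effect immediately; running some transactions and repeating the shrink sometimes completes it.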