How do I insert a large chunk of text into a table column? My project is to build a news website where people can go and read news articles. The articles are provided by the author in Word format, so how do I insert that news article into the table's column? Any help would be appreciated.
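In case it helps, here is a minimal sketch of the usual approach on SQL Server 2005 and later: keep the article body in an nvarchar(max) column and insert it as an ordinary parameter (the table and column names below are just placeholders):

CREATE TABLE dbo.Articles (
    ArticleId int IDENTITY(1,1) PRIMARY KEY,
    Title     nvarchar(200) NOT NULL,
    Body      nvarchar(max) NOT NULL   -- holds up to 2 GB of text
);

DECLARE @Title nvarchar(200), @Body nvarchar(max);
SET @Title = N'Sample headline';
SET @Body  = N'Full article text extracted from the Word document...';

INSERT INTO dbo.Articles (Title, Body)
VALUES (@Title, @Body);

From the web application the same INSERT would be sent as a parameterized command; the Word document itself has to be saved or converted to plain text or HTML first, since the database only stores whatever string is passed in.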
I have a web form with a text field that needs to take in as much as the user decides to type and insert it into an nvarchar(max) field in the database behind it. I've tried using the new .WRITE() method in my UPDATE statement, but it cuts off the text after a while. Is there a way to insert/update this in SQL Server 2005 without resorting to BULK INSERT? That bloats the transaction log, and turning the logging off requires a call to sp_dboption (or a straight-up ALTER DATABASE), which I'd like to avoid if I can.
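For what it's worth, a minimal sketch of chunked appends with .WRITE, assuming a table along the lines of dbo.Articles(ArticleId int, Body nvarchar(max)) (names are placeholders):

DECLARE @chunk nvarchar(max);
SET @chunk = N'...the next block of text sent from the web form...';

-- .WRITE cannot be applied to a NULL value, so make sure the column starts as an empty string
UPDATE dbo.Articles SET Body = N'' WHERE ArticleId = 1 AND Body IS NULL;

-- a NULL offset appends the chunk to the end of the existing value
UPDATE dbo.Articles SET Body.WRITE(@chunk, NULL, NULL) WHERE ArticleId = 1;

If text still gets cut off, one common culprit is the ADO.NET parameter being declared as NVarChar with a fixed size such as 4000 instead of -1 (max), which silently truncates the value before it ever reaches SQL Server.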
Hi... During my web search looking for a solution, I ran across some SQL CE 3.5 articles. My questions about SQL CE 3.5 are: 1) Can SQL CE 3.5 handle a 4-6 GB file - read and parse it (SQL)? 2) Can SQL CE 3.5 act as a standalone client that a user can use to view a large (4-6 GB) text file, or will I need a small .NET client to read it? More info: the text file will reside on the machine where SQL CE 3.5 is installed. There is no pull to get the data.
What is the easiest way to get the definition of a large fixed-width text file (200 columns) into SSIS? Having to define each column with the ruler would be very cumbersome.
I have inherited some databases with extremely large log files. I tried truncating the transaction log, but it did not work. Can somebody please tell me how to truncate these log files?
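In case it is useful, a common sketch, assuming point-in-time recovery of these inherited databases is not required (database and logical file names below are placeholders; sp_helpfile shows the real ones):

-- find the logical name and current size of the log file
USE MyDb;
EXEC sp_helpfile;

-- if point-in-time restores are not needed, SIMPLE recovery lets the log space be reused automatically
ALTER DATABASE MyDb SET RECOVERY SIMPLE;

-- then shrink the physical file once, here to roughly 512 MB
DBCC SHRINKFILE (MyDb_log, 512);

If the databases do need log backups, the alternative is to keep FULL recovery and schedule regular BACKUP LOG jobs so the log stops growing in the first place.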
I have text data files from a third party; they use a comma as the field delimiter and enclose the text for each column in double quotes. This isn't a problem for most of the data files, until they start sending files where a " appears within the column values. The SSIS package fails with the error:
The column delimiter for column "Column 1" was not found.
Any ideas on how to resolve this issue will be greatly appreciated. Thanks, pcp
I have some large flat files that I need to load into my SQL Server database. The file sizes range from 16 MB to 116 MB. I've tried saving the files to an Excel spreadsheet and then importing them in that format, but that didn't work. Does anyone have any suggestions?
I have a few tables in SQL Server 2005 Express. Currently they hold roughly 30-40k records each. I have my log file set to restricted growth at 90 MB. While I'm not close to reaching that, I would like my tables to be able to scale up to possibly millions of records. Based on that, I figure the transaction log file will probably need a higher threshold (unrestricted growth). For those with experience with tables that have millions of records, what average log file sizes could I expect? Is it a bad idea to just shrink the log file every night during off-peak hours so that, regardless of the number of records, I always start the day with a minimal log file? Do large log files have any effect on SQL Server performance?
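Not hard numbers, but before settling on a nightly shrink (which just forces the file to grow again the next day and fragments it), a quick check worth running:

-- per-database log size and the percentage actually in use
DBCC SQLPERF(LOGSPACE);
-- in FULL recovery, scheduled BACKUP LOG keeps the file from growing;
-- in SIMPLE recovery the log reuses its own space, so a nightly shrink/grow cycle is usually unnecessary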
We have SQL Server running on a Windows 2003 server, only because Backup Exec requires it. At the location C:\Program Files\Microsoft SQL Server\MSSQL\Data there is a file called SuperVISorNet_log.LDF which is 15 GB and is accessed daily. I apologize, but I don't know what this is!
My question is: can this file be 'pruned' (for want of a better word)? It's taking up a lot of backup space.
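The .LDF is that database's transaction log. A sketch of the usual way to keep it pruned, assuming the database itself is named SuperVISorNet (the backup path is a placeholder; sp_helpfile inside the database shows the logical file name):

-- scheduled log backups let SQL Server reuse the space inside the log file
BACKUP LOG SuperVISorNet TO DISK = 'E:\Backups\SuperVISorNet_log.bak';

-- after a log backup, shrink the file once to a sensible size
USE SuperVISorNet;
DBCC SHRINKFILE (SuperVISorNet_log, 1024);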
I am trying to run a query that deletes duplicate records on a table with 24 million records. The problem is that each time I run it, the log file fills up and I get an error saying the log file is full. For this reason the query never finishes.
Is there any way to turn off logging when running a query?
I think it also has to do with the disk drive running out of space, as the log file is growing to over 12 GB.
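Logging can't be switched off for a DELETE, but deleting in batches keeps the log small because the space can be reused (or backed up) between batches. A rough sketch, assuming SQL Server 2005 or later and that "duplicate" means identical values in Col1/Col2 (table and column names are placeholders):

WHILE 1 = 1
BEGIN
    -- number the rows inside each duplicate group and delete all but the first, 50,000 rows at a time
    ;WITH d AS (
        SELECT ROW_NUMBER() OVER (PARTITION BY Col1, Col2 ORDER BY (SELECT 0)) AS rn
        FROM dbo.BigTable
    )
    DELETE TOP (50000) FROM d WHERE rn > 1;

    IF @@ROWCOUNT = 0 BREAK;

    -- SIMPLE recovery: a checkpoint frees the log space; FULL recovery: run BACKUP LOG here instead
    CHECKPOINT;
END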
Hello, I have decided to use Linq for my current ASP.NET project and so far it has been good, but now I am implementing a system that will allow users to upload binary content such as pictures and videos. For ease of management and security, I have decided to store this content directly in the database. The performance hit is a minor concern because very few user-uploaded images/videos will be seen on any given page (usually just one). From the limited tutorials I have seen on the internet, Linq supports the SQL Server varbinary column through its System.Linq.Binary class. This class does not appear to support STREAMS and instead opts to load all of the contents into memory. This content can then be converted to an array of bytes, which can then be output to the browser via the response stream. This is not good. What if I am sending a video that is very large? Varbinary supports up to 2 GB. I can't have a 2 GB video sitting in memory. It makes a lot more sense to stream it via a small buffer. Obviously, I am going to limit the size of the content that users can upload, but the core problem remains. If I limit content size to 2 MB and I have 2 GB of memory on the server, then I can only serve 1000 users concurrently. In reality, that number would be much less because of other processes running on the server. Is there no way to stream data from a varbinary column with Linq using a small buffer of bytes? Do I need to implement some custom logic on my Linq classes? Since these classes are automatically generated, how would I do such a thing? Thanks.
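Not a LINQ answer, but one way to avoid materializing the whole value is to page it out of the table in chunks with plain SQL and write each chunk to the response stream; SUBSTRING and DATALENGTH both work on varbinary(max). Table and column names below are placeholders:

-- read one 64 KB slice of the blob; the caller loops, advancing @offset until it passes @total
DECLARE @offset bigint, @chunkSize int, @total bigint;
SET @offset = 1;          -- SUBSTRING offsets are 1-based
SET @chunkSize = 65536;

SELECT @total = DATALENGTH(Content) FROM dbo.MediaFiles WHERE MediaId = 1;

SELECT SUBSTRING(Content, @offset, @chunkSize) AS Chunk
FROM dbo.MediaFiles
WHERE MediaId = 1;

On the ADO.NET side the same effect is available through SqlDataReader with CommandBehavior.SequentialAccess and GetBytes, which can live in a partial class alongside the generated LINQ classes rather than going through System.Linq.Binary.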
Hi, I have one column which contains large data of more than 30,000 characters. When I exported the data to an Excel file, some data in this column showed as #######. These # characters are causing a problem in the SSIS package. If I delete that data (####), the SSIS package works fine. How do I solve this problem?
I have a table that I'm inserting a file into, using the Image data type to store the binary object. The code below works fine for files around 1.5 MB, but for anything larger it's as if the code won't even execute, and I get a Page Not Found error. I'm in the process of running some traces to find out what's going on in the backend, but I'm assuming there's something amiss with my code. The Image data type should handle files that size with no problem, but for some reason it isn't. Does anyone see anything wrong? Thanks

Dim iLength As Integer = CType(File1.PostedFile.InputStream.Length, Integer)
If iLength = 0 Then Exit Sub 'not a valid file
Dim sContentType As String = File1.PostedFile.ContentType
Dim sFileName As String, i As Integer
Dim bytContent As Byte()
ReDim bytContent(iLength - 1) 'byte array sized to the file (upper bound is length - 1)

'strip the path off the filename
i = InStrRev(File1.PostedFile.FileName.Trim, "\")
If i = 0 Then
    sFileName = File1.PostedFile.FileName.Trim
Else
    sFileName = Right(File1.PostedFile.FileName.Trim, Len(File1.PostedFile.FileName.Trim) - i)
End If

conn = New SqlConnection(eco)
conn.Open()
cmd = New SqlCommand("INSERT INTO ECO_Attachments (ECOID, FromType, DocName, OldRev, NewRev, NtLogin, DisplayName, FileName, FileSize, FileData, ContentType) VALUES (@ECOID, @FromType, @DocName, @OldRev, @NewRev, @NtLogin, @DisplayName, @FileName, @FileSize, @FileData, @ContentType)")
cmd.Connection = conn

Try
    File1.PostedFile.InputStream.Read(bytContent, 0, iLength)
    With cmd
        .Parameters.Add("@ECOID", SqlDbType.Int)
        .Parameters.Add("@FromType", SqlDbType.NVarChar, 50)
        .Parameters.Add("@DocName", SqlDbType.NVarChar, 250)
        .Parameters.Add("@OldRev", SqlDbType.NVarChar, 50)
        .Parameters.Add("@NewRev", SqlDbType.NVarChar, 50)
        .Parameters.Add("@NTLogin", SqlDbType.NVarChar, 100)
        .Parameters.Add("@DisplayName", SqlDbType.NVarChar, 200)
        .Parameters.Add("@FileName", SqlDbType.NVarChar, 255)
        .Parameters.Add("@FileSize", SqlDbType.Real)
        .Parameters.Add("@FileData", SqlDbType.Image)
        .Parameters.Add("@ContentType", SqlDbType.NVarChar, 50)
        .Parameters("@ECOID").Value = ECOID
        .Parameters("@FromType").Value = From
        .Parameters("@DocName").Value = DocName
        .Parameters("@OldRev").Value = OldRev
        .Parameters("@NewRev").Value = NewRev
        .Parameters("@NTLogin").Value = NTLogon
        .Parameters("@DisplayName").Value = DisplayName
        .Parameters("@FileName").Value = sFileName
        .Parameters("@FileSize").Value = iLength
        .Parameters("@FileData").Value = bytContent
        .Parameters("@ContentType").Value = sContentType
        .ExecuteNonQuery() '.ExecuteScalar()
    End With
Catch ex As Exception
    Response.Write(ex) 'Handle your database error here
Finally
    conn.Close() 'close the connection whether or not the insert succeeded
End Try
Here's my dilemma: I have a file that's 308 bytes wide by 5.7 million records. The record length is fixed, and the position and width of each field within the record are known. When I run DTS I receive this error from the MS DTS flat file provider: "Error creating file mapping view: not enough storage is available to process this command." Then when I try to continue with the wizard, it will not allow me to separate the data into the format that I need. Is there any other way to import this file using DTS?
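If DTS keeps choking, one fallback sketch is BULK INSERT with a non-XML format file that spells out each fixed-width field; BATCHSIZE keeps it from trying to swallow all 5.7 million rows at once. The table name, paths, and the three sample fields below are made up; a real format file would list every field, with the lengths adding up to 308 bytes:

-- first lines of a fixed-width format file (version, field count, then one line per field):
-- 8.0
-- 3
-- 1  SQLCHAR  0  10   ""      1  AccountNo  ""
-- 2  SQLCHAR  0  25   ""      2  Name       ""
-- 3  SQLCHAR  0  273  "\r\n"  3  Rest       ""
BULK INSERT dbo.Staging
FROM 'D:\Feeds\bigfile.txt'
WITH (FORMATFILE = 'D:\Feeds\bigfile.fmt',
      BATCHSIZE  = 100000,   -- commit every 100k rows so memory and log usage stay bounded
      TABLOCK);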
Hi, my data files sit in the default directories, and I think they are causing my partition to run out of space. I mainly use one database that I created and don't use the others (i.e. master, model, tempdb, etc.), yet I see their MDF and LDF files growing. What can I do to shrink them down, or perhaps move them off to a larger partition after shrinking?
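A sketch of what to look at first (database and file names are placeholders; tempdb's logical names are usually tempdev and templog):

-- inside each database, sp_helpfile shows where its files live and how big they are
USE MyDb;
EXEC sp_helpfile;

-- reclaim free space inside an oversized data or log file (logical name, target size in MB)
DBCC SHRINKFILE (MyDb_Data, 2048);

-- tempdb can be pointed at a bigger partition; the move takes effect after the next service restart
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = 'E:\SQLData\tempdb.mdf');
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = 'E:\SQLData\templog.ldf');

master and model normally stay tiny, so if something in the default directory is growing it is usually tempdb or a user database's log rather than the system databases themselves.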
I created an SSIS solution for reading data from dBase files and storing it in SQL Server. In a ForEach directory loop, up to one thousand dBase files are read and stored. The system where the packages run has 16 GB of RAM. For the first few hundred dBase files everything goes fine, but then the RAM no longer seems to suffice and a temp file is created (I changed the path in BufferTempStoragePath).
How can it be that temp files need to be created when there is so much RAM available? Why does RAM usage keep climbing during the SSIS package execution? Is there anything I can do to release some of it? (It is running in a loop and there is no need to keep all the data.) Could it be caused by dBase? (I use the Microsoft Jet 4.0 OLE DB Provider.)
Another thing is that the temp file is not stored in the path I set in BufferTempStoragePath. Sufficient permissions are set, but the temp file is still created in the user temp folder...
We have created a database in SQL Server 7 for support issues. We are having trouble with only one aspect of the database, and that is the body of the supported problem.
The database is basically a question/answer support database which houses frequently asked questions. When the user searches on a specific problem, a list of matching questions is shown on the screen as links. When a question is clicked, it shows the question along with the answer of possible fixes.
Our problem lies in the fact that SQL Server is truncating the answer portion. I have tried different data types with maximum lengths with no success; it keeps truncating. Right now I am on data type nvarchar with a length of 4000.
If anyone has any pointers on how to solve this problem, all input is appreciated. You can post here to the forum or e-mail me directly at jason@flnet.com.
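In case it helps: nvarchar tops out at 4,000 characters in SQL Server 7, so the usual fix there is an ntext column for the answer body. The table and column names below are guesses at the schema:

-- ntext holds up to 2^30 - 1 characters instead of nvarchar's 4,000
ALTER TABLE dbo.FAQ ADD AnswerFull ntext NULL;

-- copy the existing answers across, then point the application at AnswerFull
UPDATE dbo.FAQ SET AnswerFull = Answer;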
I am building a generic job site and, well, I have hit a speed bump. I need to store resumes in the database to be searched on. What is the best way to store these full-text resumes so that they can be easily searched?
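For what it's worth, a minimal sketch of the full-text search route on SQL Server 2005 or later (assumes full-text search is installed; all names are placeholders):

CREATE TABLE dbo.Resumes (
    ResumeId   int IDENTITY(1,1) NOT NULL,
    Candidate  nvarchar(200)     NOT NULL,
    ResumeText nvarchar(max)     NOT NULL,
    CONSTRAINT PK_Resumes PRIMARY KEY (ResumeId)
);

CREATE FULLTEXT CATALOG ResumeCatalog;
CREATE FULLTEXT INDEX ON dbo.Resumes (ResumeText)
    KEY INDEX PK_Resumes ON ResumeCatalog;

-- word- and phrase-aware searching instead of LIKE '%...%'
SELECT ResumeId, Candidate
FROM dbo.Resumes
WHERE CONTAINS(ResumeText, '"SQL Server" AND "SSIS"');

The original Word or PDF file can also be kept in a separate varbinary(max) column, but indexing a plain-text copy of the resume is the simpler place to start.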
I'm trying to store a binary data file in my database. I've tried the data types image, varchar(max), and text. I don't get an error message on loading the data, but as soon as the text file exceeds 32,000 bits, a query returns an empty data set.
Is this an SSMS display problem and the data is really there? Or is this another one of Microsoft's memory bugs?
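One quick way to tell the two cases apart before blaming the storage (table and column names are placeholders):

-- if DATALENGTH reports the full size, the bytes are stored and only the display is being cut off
SELECT Id, DATALENGTH(FileData) AS StoredBytes
FROM dbo.BinaryStore;

SSMS does clip what it shows: results-to-text is capped at 8,192 characters per column and the grid has its own "Maximum Characters Retrieved" limits under Tools > Options > Query Results, which lines up with data appearing to vanish past a certain size.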
I need to create a script that will import large XML files (500 MB to 7 GB) on a daily basis and store the data in a relational DB structure.
What is the best and fastest way of importing such files? I have played around with smaller files and found the following.
1. SSIS XML Source: it doesn't seem to like the complex element types and throws out the file.
2. Bulk file import, storing the file in an XML variable and using XQuery to parse it: this works, but it can't take a file of more than 2 GB, so I can't use this method.
3. C# + XML serialization: this also works, but seems terribly slow. I open the DB connection once, so it doesn't open and close for each DB call, but it still seems to take a long time.
How do I import large XML quickly into a relational table structure?
Hi! I want to store some really big text in my database (for my articles). The approximate size will be from 500 to 40,000 characters. I was thinking of using the database 'text' datatype. I have heard that reading these text fields is slower and decreases performance. Moreover, is it advisable to index this column for searching purposes?
Hi all, how do I put a large text page (from Notepad) into a SQL column, or assign a pointer to the data? I've tried using BOL (WRITETEXT) to no avail; I guess I'm missing something. I'm just using Enterprise Manager and Query Analyzer. I thought this should be easy. Image data should work the same way.
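Since WRITETEXT came up: it needs a valid text pointer, and a NULL column has no pointer, which is the usual reason it silently gets nowhere. A small sketch that can be run from Query Analyzer (table and column names are placeholders):

CREATE TABLE dbo.Docs (DocId int PRIMARY KEY, Body text);

-- seed the row with an empty string so the text column has a text pointer
INSERT INTO dbo.Docs (DocId, Body) VALUES (1, '');

DECLARE @ptr varbinary(16);
SELECT @ptr = TEXTPTR(Body) FROM dbo.Docs WHERE DocId = 1;

-- writes the whole value (logged); UPDATETEXT can append further blocks to the same pointer
WRITETEXT dbo.Docs.Body @ptr WITH LOG 'paste the large Notepad text here ...';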
I'm trying to transfer a table from one SQL Server 7 database to another SQL Server 7 database on another server. This table has a text field with lots of data (~0.5-1 GB). I'm using the Export wizard, and the transfer appears to complete successfully, but when I view it, the text field data has been truncated.
I'm importing a large text field from an Excel spreadsheet into my SQL database using Enterprise Manager and I'm getting the error message "Data for source column 31 'fieldname' is too large for the specified buffer size." How do I go about changing the buffer size to allow for larger text fields? Thank you.
Does anyone know how to write a stored procedure that inserts a large amount of text into a column? SQL Server BOL says that you cannot declare a local variable as text, ntext, or image. I need to write a stored procedure that captures a long string of data from a web server and stores it in a table. Thanks for your help in advance.
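Right, a local variable cannot be text/ntext, but a procedure parameter can be (it is read-only inside the procedure), and on SQL Server 2005 an nvarchar(max) parameter avoids the restriction entirely. A sketch, with the procedure and table names made up:

-- SQL Server 2005+: nvarchar(max) parameters behave like normal variables
CREATE PROCEDURE dbo.SaveCapturedText
    @Source nvarchar(200),
    @Body   nvarchar(max)      -- on SQL 2000, declare this parameter as ntext instead (read-only in the proc)
AS
BEGIN
    INSERT INTO dbo.CapturedText (Source, Body)
    VALUES (@Source, @Body);
END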
This may seem like a simple question, but I have a report/lease agreement I need to put together and wanted to know the simplest way to add large amounts of text. Basically it's all the legal stuff most leases include, amounting to some 14 pages.
Should this just be one long string, or does SSRS have another way to format it?
I have several databases that have grown to 300 GB and would like to distribute the data into multiple files across multiple drives. Can I create a new database that is spread across the new drives and use a full backup to restore it, or am I stuck with unloading the data table by table?
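One thing to be aware of: a full backup restores the original file layout, so restoring into a freshly created multi-file database will not spread the data by itself. A sketch of the usual in-place alternative: add files on the new drives and move the big tables by rebuilding their clustered indexes onto the new filegroup (names, sizes, and paths are placeholders; WITH (DROP_EXISTING = ON) needs SQL Server 2005 or later):

ALTER DATABASE BigDb ADD FILEGROUP Data2;
ALTER DATABASE BigDb ADD FILE
    (NAME = BigDb_Data2, FILENAME = 'F:\SQLData\BigDb_Data2.ndf', SIZE = 50GB)
TO FILEGROUP Data2;

-- rebuilding a table's clustered index on the new filegroup physically moves its data there
CREATE UNIQUE CLUSTERED INDEX PK_BigTable ON dbo.BigTable (Id)
    WITH (DROP_EXISTING = ON) ON Data2;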