I have a file that is automatically generated by an external process, and it will always have the same name plus a date stamp, say test_(Datestamp).txt.
What I am trying to do is use SQL to bulk insert this file, but I will not know the full file name because the date stamp also includes the time. My script takes this file every day, stores its contents temporarily in a table, and then uses a trigger to edit and repopulate the table and produce a new file that another external process uses.
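One way to handle the unknown date stamp is to list the matching files first and then build the BULK INSERT statement dynamically. A minimal sketch, assuming xp_cmdshell is available; the C:\Import path and the dbo.StagingTable name are hypothetical:

-- Find the newest test_*.txt, then BULK INSERT it with dynamic SQL.
CREATE TABLE #dir (line NVARCHAR(400));

INSERT INTO #dir (line)
EXEC master..xp_cmdshell 'dir /b "C:\Import\test_*.txt"';

DECLARE @file NVARCHAR(400), @sql NVARCHAR(MAX);

-- The newest file sorts last because the date stamp is part of the name.
SELECT TOP (1) @file = line
FROM #dir
WHERE line LIKE 'test[_]%.txt'
ORDER BY line DESC;

SET @sql = N'BULK INSERT dbo.StagingTable
             FROM ''C:\Import\' + @file + N'''
             WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'');';
EXEC (@sql);

DROP TABLE #dir;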
I have a dynamic SQL statement that uses PIVOT and returns what is technically a variable number of columns.
Is there a way to store the dynamic SQL's output in a temp table? I don't want to create a temp table with the structure of the output and limit the number of columns, and then have to change the stored procedure every time a new pivot column appears!
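One common workaround, sketched below, is to SELECT ... INTO a global temp table inside the dynamic batch, so the column list becomes whatever the PIVOT produces; the source table, its columns and the @pivotCols list are hypothetical:

DECLARE @pivotCols NVARCHAR(MAX), @sql NVARCHAR(MAX);
SET @pivotCols = N'[Jan],[Feb],[Mar]';   -- in practice built dynamically (e.g. with QUOTENAME)

IF OBJECT_ID('tempdb..##PivotResult') IS NOT NULL
    DROP TABLE ##PivotResult;

SET @sql = N'
SELECT *
INTO ##PivotResult
FROM (SELECT Region, SalesMonth, Amount FROM dbo.Sales) AS src
PIVOT (SUM(Amount) FOR SalesMonth IN (' + @pivotCols + N')) AS p;';
EXEC (@sql);

SELECT * FROM ##PivotResult;   -- columns match whatever the PIVOT generated

A global (##) table is used because a local #temp table created inside EXEC disappears when the dynamic batch ends.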
Hi ... I need to read rows from a large source file, check whether the data already exists in the destination and, if it does, update the destination, otherwise insert a new row. Now I could use a stored procedure, but that means I have to call it for each row in the text file ... any ideas on what kind of package can be created?
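One common pattern, sketched in T-SQL below, is to let the package bulk load the file into a staging table and then run a set-based update followed by an insert of the missing rows; dbo.Staging, dbo.Destination and the column names are hypothetical:

-- Update rows that already exist in the destination.
UPDATE d
SET    d.Name  = s.Name,
       d.Value = s.Value
FROM   dbo.Destination AS d
JOIN   dbo.Staging     AS s ON s.BusinessKey = d.BusinessKey;

-- Insert rows that do not exist yet.
INSERT INTO dbo.Destination (BusinessKey, Name, Value)
SELECT s.BusinessKey, s.Name, s.Value
FROM   dbo.Staging AS s
WHERE  NOT EXISTS (SELECT 1 FROM dbo.Destination AS d WHERE d.BusinessKey = s.BusinessKey);

In the package itself this amounts to a Data Flow into the staging table followed by an Execute SQL Task, rather than a per-row procedure call.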
Hello there, I just want to ask whether storing data in a database is much better than storing it in the file system. For one, I am currently developing my thesis, which uploads a .doc file as a BLOB to a web server (currently I'm using the ASP.NET localhost) and then retrieves it from that localhost. I also want to know if I'm right about this: is the ASP.NET localhost the same as a normal web server on the net? I'm thinking of uploading and downloading the files from a web server. Although our thesis defense didn't require us to actually put it on the net, we were advised to use a localhost on our PCs, so I'll just be using my local server. Is it OK to just use a web server for storing files rather than a database?
I am trying to create an SSIS package with a dynamic CSV file as output, and the output format contains the query output.
sample file name:
Unique identifier + query output + systemdate();
The expression looks like this:
@[User::FilePath] + @[User::FileName] + ".CSV"
-- User::FilePath is a variable from the SSIS package. The file name is the output of the SQL query; using a Script Task I have assigned the value to @[User::FileName].
When I debug the Script Task the variable gets its value properly, but when the same variable is used for the Flat File destination it is not working.
I am really in great trouble. My requirement seems quite complex to me, though it may be quite simple for some of you. Here is my problem.

I am doing a project where the client has a specific requirement: he wants to build a query at runtime to select the particular records he wants. We have provided a user interface where he can select any data source, e.g. SQL Server, Oracle, Excel, etc., and specify the database name. He is shown all the tables and the columns under those tables. He selects columns from those tables and writes the query he wants, with a WHERE clause if required.

I save these queries in a table and store the column names in another table, where we map the original column names to the columns of a third table in which we want to store the actual result set of the query the user prepared. For example, if the user prepares a query like 'SELECT CUST_ID, CUST_NAME FROM CUSTOMER WHERE CUST_ID = 10', I will store CUST_ID in column COL1 of table TAB1 and CUST_NAME in COL2 of table TAB1; we have fifty such columns in that table.

Now the question: since everything here is dynamic (data provider, database, tables, columns and WHERE clause), how can I get the result set of those queries and store the output in that storage table with columns COL1, COL2 and so on, up to 50 columns? Please help me on this, as it is urgent. I will be very thankful to you.
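For the SQL Server case, one way to capture an arbitrary saved query into the generic table is INSERT ... EXEC; the sketch below assumes the saved query returns exactly the columns that were mapped to COL1, COL2, ..., in that order (TAB1 and the sample query come from the post, the rest is assumed):

-- Run the saved user query and capture its result set into the mapped columns.
-- The INSERT column list must have exactly as many columns as the query returns.
DECLARE @userQuery NVARCHAR(MAX);
SET @userQuery = N'SELECT CUST_ID, CUST_NAME FROM CUSTOMER WHERE CUST_ID = 10';

INSERT INTO dbo.TAB1 (COL1, COL2)
EXEC (@userQuery);

For Oracle or Excel sources the same pattern can still apply if the query is issued through a linked server or OPENROWSET, so that SQL Server executes it.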
I have a requirement to bulk load CSV files into a SQL table. Sometimes some columns do not come in the CSV file (sometimes 100 columns and sometimes 80 columns), and then the package fails. How can I create a table dynamically based on the CSV file's structure?
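A rough T-SQL sketch of the idea: read only the header row of the file, then build a CREATE TABLE statement from it so the table always matches the file. The path, the assumption that the file contains no tab characters, and treating every column as NVARCHAR(255) are all assumptions:

CREATE TABLE #hdr (line NVARCHAR(MAX));

-- Pull in just the first line; with the default tab field terminator the whole
-- comma-separated header lands in the single column.
BULK INSERT #hdr FROM 'C:\Import\data.csv'
WITH (LASTROW = 1, ROWTERMINATOR = '\n');

DECLARE @header NVARCHAR(MAX), @sql NVARCHAR(MAX);
SELECT @header = REPLACE(line, CHAR(13), N'') FROM #hdr;   -- strip a trailing CR if present

-- Turn "ColA,ColB,ColC" into "CREATE TABLE dbo.CsvImport ([ColA] NVARCHAR(255), ...)".
SET @sql = N'CREATE TABLE dbo.CsvImport (['
         + REPLACE(@header, N',', N'] NVARCHAR(255), [')
         + N'] NVARCHAR(255));';
EXEC (@sql);

DROP TABLE #hdr;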
We are using a VB component that uses a file DSN to connect to SQL Server, but we have noticed that when using SQL authentication to set up the DSN in the Control Panel, the password does not get saved. So the only way we can connect is to use a user with no password, i.e. a blank one.
We only noticed this when we tried to use a user other than sa, which happened to have a blank password.
Has anybody else found a solution to this problem?
In SQL Server 2005, what is the best way to take a text file and store it in a table field, and then later extract that file to a directory with the original name and format intact?
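A minimal sketch of the storage side, with hypothetical table and path names: keep the bytes in a VARBINARY(MAX) column alongside the original file name, loading them with OPENROWSET(BULK ..., SINGLE_BLOB):

CREATE TABLE dbo.StoredFiles
(
    FileId       INT IDENTITY(1,1) PRIMARY KEY,
    OriginalName NVARCHAR(260) NOT NULL,
    Contents     VARBINARY(MAX) NOT NULL
);

INSERT INTO dbo.StoredFiles (OriginalName, Contents)
SELECT N'report.txt', f.BulkColumn
FROM OPENROWSET(BULK 'C:\Import\report.txt', SINGLE_BLOB) AS f;

To get the file back out with its name intact, a small client program (or bcp queryout) can stream the Contents column to disk using the stored OriginalName.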
I have records with 50-plus fields of data. I was thinking of writing all the fields' data into an XML file and then storing it in a SQL Server database for retrieval later on. Is there any way I can go about doing this?
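One possibility, as a rough sketch: build the XML with FOR XML and keep it in an XML column (available from SQL Server 2005). The table, the source columns and the element names below are hypothetical:

CREATE TABLE dbo.RecordArchive
(
    RecordId  INT IDENTITY(1,1) PRIMARY KEY,
    RecordXml XML NOT NULL
);

-- Serialise each source row to XML and store it.
INSERT INTO dbo.RecordArchive (RecordXml)
SELECT (SELECT s.CustomerId, s.CustomerName, s.OrderDate   -- ...list all 50+ columns here
        FOR XML PATH('Record'), TYPE)
FROM dbo.SourceRecords AS s;

-- Individual values can be pulled back out later with the xml methods.
SELECT RecordXml.value('(/Record/CustomerName)[1]', 'NVARCHAR(100)') AS CustomerName
FROM dbo.RecordArchive;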
How do I store a sound/wave file in SQL Server? Does SQL Server have the facility to store sound files in the database, and if so, what data type should be used and what functions relate to it? Thanks, Paul
Does anyone know what's wrong with my code? ~Nothing wrong with my code, the code is correct!
I let it run and it keeps saying that the file does not exist. It seems the path to my file doesn't exist! Any suggestions on how to write the path of my file so that I can retrieve it, i.e. open the file and access it?
Private Sub Button1_Click(ByVal sender As Object, ByVal e As System.EventArgs) Handles Button1.Click
    ' Requires "Imports System.IO" at the top of the file for FileStream.
    Dim SQL As String
    Dim rawData() As Byte
    Dim RowsAffected As Integer
    Dim filename As String

    ' The path needs its backslash separators, otherwise File.Exists is always False.
    filename = "H:\My Documents\abc.pdf"

    Try
        If System.IO.File.Exists(filename) Then
            Dim filePermission As New System.Security.Permissions.FileIOPermission(System.Security.Permissions.FileIOPermissionAccess.Read, System.Security.AccessControl.AccessControlActions.View, filename)
            filePermission.Demand()
        Else
            ' Stop here instead of falling through and trying to open a missing file.
            MessageBox.Show("file does not exist", "File", MessageBoxButtons.OK, MessageBoxIcon.Information)
            Return
        End If

        ' Open the file read-only so its bytes can be loaded into rawData.
        Dim fs As FileStream
        fs = New FileStream(filename, FileMode.Open, FileAccess.Read)
I have a package that processes a flat file, which is generated by a separate system each weekday around 3:30AM. I would like my package to store the timestamp of the input file and compare it to the timestamp of the next day's file. If they are the same, then the package can exit without reprocessing a particular file again.
So, my first question is whether a package variable can be updated so that its value will persist from one run to the next?
Second, what is the syntax to retrieve the file's timestamp, either directly from the file connection manager or within a script task?
I am trying to create a file DSN to link a password-protected Access database to a SQL Anywhere back end. I create the DSN using the Microsoft Access Driver, select my Access database, go to advanced properties and type in the database password (the user ID is admin). However, every time I change records in the SQL front end, I am prompted to enter the Access database password. It doesn't seem to be storing it.
Hi all... I want to store the output of a SQL query as a text file. This is for a project I am doing in Java. Can you help me with how to do this? It is very urgent.
Hi, I am using command files (.cmd) and the osql utility to connect to a database. Following is the requirement: I have a query "SELECT top 1 au_fname FROM pubs..authors".
Can I store the value of au_fname in the command file?
"[Flat File Destination [13]] Error: Data conversion failed. The data conversion for column "SDATA" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.". " "[Flat File Destination [13]] Error: Cannot copy or convert flat file data for column "SDATA". " "[DTS.Pipeline] Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Flat File Destination" (13) failed with error code 0xC02020A0. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure. " "[DTS.Pipeline] Error: SSIS Error Code DTS_E_THREADFAILED. Thread "WorkThread0" has exited with error code 0xC02020A0. There may be error messages posted before this with more information on why the thread has exited. " "[DataReader Source [207]] Error: The component "DataReader Source" (207) was unable to process the data. " "[DTS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on component "DataReader Source" (207) returned error code 0xC02090F5. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure. " "[DTS.Pipeline] Error: SSIS Error Code DTS_E_THREADFAILED. Thread "SourceThread0" has exited with error code 0xC0047038. There may be error messages posted before this with more information on why the thread has exited. "
I am selecting data from an ODBC source and then copying it to a text file, and I always get this error; when I change the destination to an Excel file it works perfectly. But the whole point of the package is to copy to a text file.
Please could you help me; your replies will be greatly appreciated.
I am using the FileUpload control in ASP.NET 2.0 to upload files and store them in a SQL Server database as an image. I am fine with MS Office files, image files, etc. We have Product Center (Pro/E) engineering software to configure parts, and the files generated by this software have a PLT extension; when I store these files, the file type comes through as plain/text. I am using FileUpload.PostedFile.ContentType to get the file type. How can I store PLT and TIF files in SQL Server? Please help.
I am looking for a Microsoft recommendation on storing images in SQL Server vs. the file system. Here is what our requirement is.
Currently we are using SQL Server 2000 with a VB6 (COM+) and ASP front end. Our architecture involves log shipping and replication: we use log shipping for our DR site and replication to replicate to the data warehouse and a DB2 environment. I have been trying to research this topic and have come across opinions favoring both sides, mostly the file system, but nobody seems to point to any Microsoft recommendation.
With our application we need transactions, backups, log shipping and replication to work with these images. I have read about MS's TerraServer implementation, and there they recommend storing the images in SQL Server if the above is required. (https://www.microsoft.com/technet/prodtechnol/sql/2000/reskit/part3/c1161.mspx?mfr=true)
Can somebody please point me to a Microsoft recommendation, and what does SQL Server 2005 offer in this respect?
Dear all, I create this HTML file on the fly in my ASP.NET application, and what I would like to do is store it inside my SQL Server 2005 database. What would be the best way? The HTML file itself is not really big, probably not more than 600-800 characters at most. I was thinking of the text-type fields of the database, and then when retrieving it, dumping it into a file and saving the file with an .html extension. Are there any better suggestions? Thank you for your time.
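A minimal sketch of the storage side, with hypothetical table and column names: for a document of a few hundred characters an NVARCHAR(MAX) column (the SQL Server 2005 replacement for the older text type) is plenty, and the application can write it back out as an .html file on retrieval:

CREATE TABLE dbo.HtmlDocuments
(
    DocumentId INT IDENTITY(1,1) PRIMARY KEY,
    FileName   NVARCHAR(260) NOT NULL,
    Html       NVARCHAR(MAX) NOT NULL,
    CreatedAt  DATETIME NOT NULL DEFAULT GETDATE()
);

INSERT INTO dbo.HtmlDocuments (FileName, Html)
VALUES (N'page1.html', N'<html><body>...</body></html>');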
Hello, we have some queries that are long and resource-intensive. We have thought about running the queries and storing the data in a text file for lookup from our website.
Example: our online store only displays items that are in stock, so when a user selects a category, a query runs, grabs only the items that are in stock, and then displays them. There could be thousands of items the query needs to sort through before displaying the ones that are in stock. What if we ran this query once every hour and stored the results in a txt file? The ASP page would then go to the text file to grab the results instead of having to run the query every time a user selects a category. Will this speed up the site by not having to query every time? Would this be a correct way to eliminate queries that run thousands of times a day?
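For comparison, a database-side version of the same hourly caching idea, with hypothetical table names: a scheduled job refreshes a small cache table once an hour, and the ASP page queries the cache instead of the large tables:

-- Refresh the cache of in-stock items; run this on a schedule (e.g. hourly).
TRUNCATE TABLE dbo.InStockCache;

INSERT INTO dbo.InStockCache (ItemId, CategoryId, ItemName, Price)
SELECT i.ItemId, i.CategoryId, i.ItemName, i.Price
FROM   dbo.Items AS i
WHERE  i.QuantityOnHand > 0;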
I need to know how I can get the dynamically created file name from the Flat File destination so that I can insert it into a package audit table.
Scenario: I have created a package that successfully outputs dynamically named flat files {format: "C:\Test\Comms_File_" + User::FileNumber + "_" + Date + ".txt",
e.g. Comms_File_1_20150724.txt, Comms_File_2_20150724.txt, etc.} using a Foreach Loop Container:
* Enumerator set to "Foreach ADO Enumerator", with the ADO object source variable selected to identify how many total loop iterations there are, i.e. let's say 4, and thus 4 files to be created.
* Variable Mappings: added User::FileNumber, which indicates which file number the current loop iteration is, i.e. 1, 2, 3, 4.
For the Data Flow task I have an OLE DB Source and a Flat File destination, where the Flat File ConnectionString is set up as:
I was wondering if there is a way to 'Move File' with the File System Task inside of a Foreach Loop container, but with the destination path variable set dynamically.
Currently, this is what I have: the FileDestinationPath variable is set to C:\TestFiles, the FileSourcePath variable is set to C:\TestFiles, and the FileNameAndLocation variable is set to blank.
The Foreach Loop Container iterates through the folder C:\TestFiles, which has .txt files in it with dates in the file name, e.g. Test_09142006.txt. It sets the file path (fully qualified) to the Variable Mapping FileNameAndLocation.
The Script Task (within the Foreach Loop, first step) sets FileDestinationPath to the correct dated folder within C:\TestFiles. For example, if the text files I want to move are for the 14th of September, it takes FileDestinationPath and appends the date folder to the end of it. The text files have a date in the file name (test_09142006.txt) and I am picking this apart (from FileNameAndLocation in the Foreach Loop) to get the folder date: Dts.Variables("User::FileDestinationPath").Value = Dts.Variables("User::FileDestinationPath").Value & "\" & Month & "_" & Day & "_" & Year, which gives me C:\TestFiles\9_14_2006.
The File System Task (within the Foreach Loop, second step) is where the action is supposed to occur. I want it to take FileDestinationPath and move the FileNameAndLocation file (from the Foreach Loop) into that folder on each run.
Now for my problem. I want this package to run every day, but it has to set the FileDestinationPath variable dynamically according to that day's date. Basically, how do I get this to work, since I can't hard-code the destination path variable from the start? I have the DestinationVariable on the File System Task set to the FileDestinationPath variable, after the Script Task builds it. However, using FileNameAndLocation as the SourceVariable on my File System Task tells me that the variable "FileNameAndLocation" is used as a source or destination and is empty.
Let me know if I need to clarify further...I may be missing something very simple. Any help would be greatly appreciated!
I want to store a Zip file as a BLOB, but I get an error: "File 'C:<path of mdf file>' appears to have been truncated by the operating system. Expected size is 2560KB but actual size is 1536KB" whenever the BLOB exceeds 1MB.
Any suggestions? How can I store larger .zip files in the database? I am using MS SQL 2005 Express, and the data type I gave the column to store the BLOB is varbinary(MAX).
I am inserting a byte array into this field. It works fine for a zip file of less than 1MB, but as soon as the zipped file size goes beyond 1MB, lots of errors pop up and then the database is not readable. It says that the data in the database may have been corrupted...
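Since varbinary(MAX) can hold up to 2 GB, the 1 MB threshold itself should not be the limit; the "truncated by the operating system" message points at damage to the .mdf file rather than the column type. A small diagnostic sketch, with a hypothetical database and table name:

-- Check the database for corruption after the failed inserts.
DBCC CHECKDB (N'YourDatabase') WITH NO_INFOMSGS;

-- Confirm the column really is VARBINARY(MAX) (max_length = -1) and not VARBINARY(8000).
SELECT c.name, t.name AS type_name, c.max_length
FROM sys.columns AS c
JOIN sys.types   AS t ON t.user_type_id = c.user_type_id
WHERE c.object_id = OBJECT_ID(N'dbo.ZipStore');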
I have a source files folder where files are generated every day. My goal is to pick the latest file and copy this single file to another folder. I used the Foreach Loop container, got the latest file and stored the file name in a variable, i.e. LatestFile. Then I want to use the File System Task to copy it to the destination. At design time I cannot set up LatestFile since I don't know its name yet, so when I set up the Source Connection property of the File System Task, it does not allow me to leave the SourceVariable blank!