Hello,
I am new to SQL Server and would like to connect to a particular database on the server using SQLCMD. I have looked at various SQL how-to sites, and none of them mention where to find the input file name.
I am not sure if this has been asked before but I couldn't find any thread talking about this.
Let's say we have a parameter in the .sql input file called @Start_Date. How can we pass the value of a particular date, for example "02-28-2007", to @Start_Date via SQLCMD? Is it possible?
I'm trying to avoid writing a simple Windows application... if things can be achieved via the DOS command line, that will keep everything simple!
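For reference, a minimal sketch of the scripting-variable mechanism (the file name query.sql, the variable name StartDate, and the Orders table are placeholders, not from the post). Inside the input file, a $(StartDate) scripting variable feeds the T-SQL parameter:

-- query.sql
DECLARE @Start_Date datetime;
SET @Start_Date = '$(StartDate)';
SELECT * FROM Orders WHERE OrderDate >= @Start_Date;

The value is then passed on the command line with -v:

sqlcmd -S myserver -d MyDatabase -i query.sql -v StartDate="02-28-2007"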
Hi all, I have the "Northwind" database in my SQL Server Management Studio Express.
In my C:\ProSSEApps\Samples\ForChapter02\Chapter02 folder, I have the following two files:

(1) ListColumnValues (MS-DOS batch file):

sqlcmd -S .\sqlexpress -v DBName = "Northwind" CName = "CompanyName" TName = "Shippers" -i c:\prosseapps\chapter02\ListColumnValues.sql -o c:\prosseapps\chapter02\ColumnValuesOut.rpt

(2) ListColumnValues (Microsoft SQL Server query file):

USE $(Northwind)
GO
SELECT $(CompanyName) FROM $(Shippers)
GO

When I ran the batch file:

C:\ProSSEApps\Samples\ForChapter02\Chapter02>ListColumnValues.bat

I got the following ColumnValuesOut.rpt with error messages:
'Northwind' scripting variable not defined.
Msg 102, Level 15, State 1, Server L1P2P3\SQLEXPRESS, Line 1
Incorrect syntax near '$'.
'CompanyName' scripting variable not defined.
'Shippers' scripting variable not defined.
Msg 102, Level 15, State 1, Server L1P2P3\SQLEXPRESS, Line 1
Incorrect syntax near 'CompanyName'.
I copied these T-SQL statements from a book and I do not know how to correct them. Please help and tell me how to correct these errors.
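For comparison, a corrected sketch (not tested against the book's files): the .sql file has to reference the variable names that the -v switch defines (DBName, CName, TName), not their values, and the spaces around = in the -v assignments are best removed:

sqlcmd -S .\sqlexpress -v DBName="Northwind" CName="CompanyName" TName="Shippers" -i c:\prosseapps\chapter02\ListColumnValues.sql -o c:\prosseapps\chapter02\ColumnValuesOut.rpt

USE $(DBName)
GO
SELECT $(CName) FROM $(TName)
GO

With those substitutions sqlcmd expands the script to USE Northwind / SELECT CompanyName FROM Shippers.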
I am using the following batch file to execute a script that creates a db and all its objects in the local sql express:
sqlcmd -S (local)\SQLExpress -i C:\CreateDB.sql
This works fine, but I'm wondering if there's an easy way to embed the script in the batch file, so users don't have to worry about putting the script on the C: drive. I tried getting rid of the -i parameter and pasting the script from the .sql file into the batch file, but it didn't work.
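Two options that might help, sketched under the assumption that the .sql file ships alongside the .bat file: reference the script relative to the batch file's own directory using %~dp0 (which expands to the folder the batch file lives in, with a trailing backslash):

sqlcmd -S (local)\SQLExpress -i "%~dp0CreateDB.sql"

Or, for a short script, pass the statements inline with -Q instead of -i:

sqlcmd -S (local)\SQLExpress -Q "CREATE DATABASE MyDb"

(MyDb is a placeholder; a full create-everything script is usually too long for -Q, which is why the %~dp0 route tends to be the practical one.)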
A simple DTS job I have is giving me fits. It is a straight copy-column job from a pipe-delimited text file into a table. The input file comes from a mapped drive linked to a shared filesystem on a Sun Solaris box.
The typical scenario. I run the DTS job to load 8000 rows from the input text file. Job succeeds.
A week later, the text file is updated with 9000 new rows. I run the DTS job with no changes and it loads 8000 rows from last week.
I reboot my Win XP pc and run DTS again. It now loads the 9000 new rows.
I tried mapping to a UNC to no avail.
Is it buffering the old file somewhere? I need help.
Current environment: SQL Server 2000 with all the latest SPs and patches; Windows 2000 Server with all the latest SPs and patches; drive G: mapped to a shared filesystem on Sun Solaris via Samba.
I am quite new to SSIS (I was a DTS developer) and I have a specific requirement to validate all incoming data using regular expressions. For each row in my input file, all columns will need to be validated against an expression. We have approx 60 different input files (a combination of xml and text files) with column counts going from 10 up to 120.
Also each input file will contain a footer which will need to be validated for record counts.
I would like the solution to be as generic as possible, as the requirements for each file are similar; the only differences are the column names and the expressions to check against. I would really rather not have a different data flow task for each input file type, as there are so many.
Can anybody suggest the most efficient/reusable way to do this? What I was thinking of was this:
(1) Split the input file into detail/footer.
(2) Create a temporary table, based on the input file type, with CHECK constraints (regular expressions) for each column.
(3) Load each detail record into the temp table.
(4) Redirect any failures to another destination, e.g. a SQL error table.
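One caveat on step (2), with a minimal sketch (the column names are hypothetical): SQL Server CHECK constraints do not support real regular expressions, only LIKE-style patterns, so each column's rule has to be expressible that way:

CREATE TABLE #SupplierStage (
    SupplierCode varchar(6)
        CHECK (SupplierCode LIKE '[A-Z][A-Z][0-9][0-9][0-9][0-9]'),  -- two letters then four digits
    Quantity varchar(10)
        CHECK (Quantity NOT LIKE '%[^0-9]%')                         -- digits only
);

Anything needing full regex power would have to move into a Script Component instead of a constraint.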
I am trying to import a file into a db table from a mainframe system. When I look at it on the PC, it has some special characters in it. They look like nulls and/or tabs. When I try to define the fields in DTS, I only see up to the first special character. I tried to write a quick VB program to strip them out, but when VB reads the file they get stripped out before I see them in the program. Any help would be greatly appreciated.
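If the file can first be landed in a staging table as raw text, one possible cleanup (a sketch; Staging and RawLine are hypothetical names) is to strip the suspect characters in T-SQL, since CHAR(0) is the null byte and CHAR(9) the tab:

UPDATE dbo.Staging
SET RawLine = REPLACE(REPLACE(RawLine, CHAR(0), ''), CHAR(9), ' ');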
Version 2000. How do I do something like the example

SELECT *
FROM OpenDataSource('Microsoft.Jet.OLEDB.4.0',
    'Data Source="c:\Finance\account.xls";User ID=Admin;Password=;Extended properties=Excel 5.0')...xactions

but using a .txt file instead? I tried building it using Access (that usually works :-) ) and that gives a connection string of:

Text;DSN=Link Sammenkædningsspecifikation;FMT=Delimited;HDR=NO;IMEX=2;CharacterSet=850;DATABASE=c:\temp;SourceTableName=link.txt

but I can't seem to "massage" it into working on the SQL Server. If I quick-and-dirty swap 'Microsoft.Jet.OLEDB.4.0' with 'Text' it gives the error:

Could not locate registry entry for OLE DB provider 'Text'.

TIA
/jim
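A pattern that usually works for text files with the Jet provider (hedged; the folder and file name below are taken from the post, but this is untested here): point Data Source at the folder, put Text inside Extended Properties, and name the file with its dot replaced by #:

SELECT *
FROM OpenDataSource('Microsoft.Jet.OLEDB.4.0',
    'Data Source=c:\temp\;Extended Properties=Text')...[link#txt]

The Text keyword belongs inside Extended Properties rather than in place of the provider name, which is why substituting it for 'Microsoft.Jet.OLEDB.4.0' produces the missing-registry-entry error.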
I have a Data Flow Task within a Foreach Loop Container.
It's reading a delimited flat file and inserting into a DB table. The Flat File Source reads a specified folder and picks up all files with the extension .txt.
Question:
How can I record and save the file names the package picks up and loads?
For example, if I have these under the C:\Test folder:

File1.txt
File2.txt
File3.txt
And when the package picks up all the files above and loads them in, is it possible to read the name of the file it's processing? I need to read these file names (in this example File1, File2, File3) and store them in a DB table.
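A sketch of the usual approach (the variable and table names are hypothetical): on the Foreach Loop's Variable Mappings tab, map index 0 of the file enumerator to a string variable such as User::FileName. Then either log it with an Execute SQL Task inside the loop:

INSERT INTO dbo.LoadedFiles (FileName, LoadedAt) VALUES (?, GETDATE());

with User::FileName mapped to parameter 0, or add a Derived Column in the data flow that emits @[User::FileName], so the source file name is stored on every row it loads.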
I'm trying to import a text file, but the primary key column contains duplicates (turns out to be the nature of the legacy data). How can I kick out all duplicates except, say, a single row per primary key value?
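One way, sketched on the assumption that the file is first bulk-loaded into a keyless staging table (Staging, Target, pk, and col1 are placeholders): collapse to one row per key on the way into the real table:

INSERT INTO dbo.Target (pk, col1)
SELECT pk, MAX(col1)
FROM dbo.Staging
GROUP BY pk;

Note that MAX() per column picks values independently, so if duplicate rows differ, the surviving "row" can mix values from several of them.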
Well, the case here is simply that I have a Suppliers.csv file as an input. When taking that file, I do some validation on its rows (data type validations, mandatory field validations, etc.). When some rows do not meet the requirements I put in these validations, they are supposed to be directed to an Errors table in my SQL DB.
I want to include the position of the invalid row within the input file (the row which did not pass the aforementioned validations) in the Errors table when I direct the invalid rows to it.
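SSIS has no built-in file row number; the usual trick (sketched here, names hypothetical) is a Script Component early in the flow that increments a counter per row, so the ordinal travels with the row into the error path. The landing table might then look like:

CREATE TABLE dbo.SupplierErrors (
    InputRowNumber   int           NOT NULL,  -- ordinal of the row in Suppliers.csv
    RawData          varchar(4000) NULL,      -- the offending row as read
    ErrorDescription varchar(255)  NULL       -- which validation failed
);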
I am trying to transfer 200 .txt files into SQL Server by using Query Analyzer. The command is BULK INSERT [tableName] FROM 'path\filename.txt'. However, I need to read and modify the txt files. I am new to SQL Server, but I believe there must be someone out there, a wizard, who can do what I want easily.
Thank you for the help in advance!
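For the run-it-200-times part, a batch file loop is one low-tech option (a sketch; the folder, server, database, and table names are placeholders, and any per-file modification would still need to happen before the load):

for %%f in (C:\Data\*.txt) do osql -E -S myserver -d MyDb -Q "BULK INSERT dbo.MyTable FROM '%%f' WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')"

(osql is the SQL Server 2000-era command-line tool; on 2005 and later, sqlcmd takes the same switches here.)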
Here is the raw data layout, which is comma delimited (four repeating blocks laid out side by side):

BDate 1/1/1990    BDate 1/1/1990    BDate 1/1/1990    BDate 1/1/1990
Edate 1/1/2005    Edate 1/1/2005    Edate 1/1/2005    Edate 1/1/2005
Fq D              Fq D              Fq D              Fq D
Date R P M E D    Date R P M E D    Date R P M E D    Date R P M E D
1/1/90 1 2 3 4 5  1/1/90 2 3 4 5 6  1/1/90 3 4 5 6 7  1/1/90 4 5 6 7 8
       2 3 4 5 6         1 2 3 4 5         3 4 5 6 7         6 7 8 9 1
1/1/05 ......     1/1/05 ....       1/1/05 .....      1/1/05 .....
This is the desired output after the load into the table, which stacks each repeating block on top of the previous one:

Date   R P M E D
1/1/90 1 2 3 4 5
       2 3 4 5 6
1/1/05 ......
1/1/90 2 3 4 5 6
       2 3 4 5 6
1/1/05 ......
1/1/90 3 4 5 6 7
       3 4 5 6 7
1/1/05 ......
1/1/90 4 5 6 7 8
       6 7 8 9 1
1/1/05 ......
I'm wondering if there is any way to get SSIS to notice, in the Flat File Source, that a "Ragged right" text input file has a record that is too short to populate all the specified columns.
I am reading data from a file that is supposed to be fixed length records, but record 193,591 (out of approx. 500,000) is 20 bytes short of the fixed length (60 bytes). So I changed the input to "ragged right" and found that I can thereby continue to read the file, and load the data (after setting the "maximum errors" to a number greater than the initial "1"). (Without this change to "ragged right", every record after the bad one was "out of synch" with the column arrangement -- so they never made it into the database table destination.)
But the "failures" I am now getting are during the Data Conversion step, when I try to convert some columns to integers (from text, in the input stream). And by looking at the data with a "Redirect Row" setting for the Data Conversion step, I am able to see that the Flat File Source is reading "right past the end of the row."
Is there a way to get the Flat File Source to honor the CR-LF record terminator, and decide that some text columns should contain "nothing" (NULL or zero-length strings), rather than the bytes that contain the CR-LF and the initial text from the next record? Can this somehow be noticed as an error condition?
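One workaround, sketched with hypothetical column positions and names: define the flat file source as a single wide column per line, so the CR-LF genuinely terminates the row, and carve the fields out downstream; a record that is too short then yields empty strings, which can be mapped to NULL:

SELECT
    SUBSTRING(RawLine, 1, 10)                     AS Col1,
    NULLIF(RTRIM(SUBSTRING(RawLine, 11, 30)), '') AS Col2,
    NULLIF(RTRIM(SUBSTRING(RawLine, 41, 20)), '') AS Col3
FROM dbo.RaggedStage;

The same carving can be done in a Derived Column with SUBSTRING expressions if it has to stay inside the data flow.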
I am using the flat file connection manager. I read in a set of flat files and copy them into a staging database table. I want to ignore anything in the text file past column 266, but I can't get this to work correctly. I can't define the last column to end at 266 and ignore the rest. I tried adding an extra column, but that did not work either.
I have encountered an issue with one of my log files which has me somewhat puzzled. I received notification at 00:02:16:36 that the log file was full: "The log file for database 'TDS' is full. Back up the transaction log for the database to free up some log space." A large number of these same messages have followed.
Prior to these messages starting, the log file was backed up successfully at 00:00:01:10. Shortly after this there is a message, and this is the bit that confuses me, at 00:02:16:35 that says: "d:\mssql\data\tds_log.ldf: Operating system error 112 (error not found) encountered."
So from what I can tell, the log file was backed up successfully. Then approx one minute later the system can't find the log file??? Then I keep getting these log-file-full messages until the next log backup (60 minutes later)... then everything is fine. HUH?
When looking at the log file for SQL it is sitting there, and isn't new, as its created date is back when I set up the database initially.
Can anyone shed any light on what this message might mean and whether there is specifically something that I can do to circumvent it in the future?
Oh yeah, we have nothing tricky in terms of replication or log shipping or anything like that.
Recently my server hard disk crashed and I lost one of the secondary data files of a SQL Server database. I have the main file intact, and the recent data was in that file only, but I was unable to attach the database as it required the other data files as well.
Is there any way to create the database again from the existing data file? (Unfortunately there is no backup either that we could use to recreate the missing data file.)
Each day I receive a file with a different name. For example, the name is filename_mmddyyyy.txt where filename_ stays constant and mmddyyyy is the date of the file. The file is always in the same format.
I want to build an SSIS package to which I pass this file name. I can write a script to generate the correct file name. How do I build the SSIS package so it can accept the input parameter and find the correct file to process?
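A sketch of the usual pattern (the package, variable, and path names are placeholders): give the package a string variable such as User::FileName, set the flat file connection manager's ConnectionString property to the expression @[User::FileName], and pass the value in at run time:

dtexec /f LoadDaily.dtsx /set \Package.Variables[User::FileName].Properties[Value];"C:\Data\filename_02282007.txt"

The script that computes the day's file name then only needs to build that dtexec command line.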
Hi, I ran the following command. It went well, but I didn't find a file authors.txt. Where will I find this authors.txt?

exec master..xp_cmdshell "bcp pubs..authors out authors.txt -c -Sserver -Uuser -Ppwd"
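Because no path was given, bcp wrote authors.txt to the working directory of the process that ran it on the server machine (for xp_cmdshell that is typically %WINDIR%\system32), not on the client. An explicit path avoids the hunt; a sketch, where c:\temp must exist on the server:

exec master..xp_cmdshell 'bcp pubs..authors out c:\temp\authors.txt -c -Sserver -Uuser -Ppwd'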
I'm in BIG BIG trouble... A co-worker tried to help me out with a huge database with a huge log file. So he first detached my db and deleted the log .ldf file... Now I've tried to attach it back.... ERROR... Could someone help me out... please..
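If the detach itself completed cleanly, a fresh log can usually be rebuilt from just the .mdf (a sketch; the database name and path are placeholders):

EXEC sp_attach_single_file_db
    @dbname   = 'MyDb',
    @physname = 'C:\Data\MyDb.mdf';

This only works when the database was cleanly shut down at detach time; otherwise it becomes a much harder recovery job.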
I have been trying to find a way to add the File Enumerator collection option to my version of SSIS. Newer versions have a File Enumerator and I can't seem to find that option. Can anyone help?
Someone created a Word input file (15 pages, including check boxes, text boxes, drop-down lists...). Is it possible to save the data in the Word input file to a SQL table?
Can I select an input flat file via a dialog box, or is it necessary to either hardcode the file name or change the file name every time to a similar format? And how can a query be run and processed in SQL right after the flat file is input, so processing can continue?
Dear All, SOS, please help. I have an MS SQL DB with 4 .ndf files. One (the first) .ndf file is missing; somehow it got deleted?? Is there any way I can rebuild my DB? The .MDF and .LDF files are intact. Please help ASAP.
Dhumbak
I have created a trigger on table FER that fires when data is inserted or updated into the table. I also made a batch file using bcp to extract the newly updated/inserted records. But I am missing data in the bcp out file, like this: missing 1200 records, blocked at:

777946 296188 2007-01-29 21:25:45.063
778145 296494 2007-01-29 21:25:47.063

1. trigger.sql

CREATE TABLE [FERUpdate] (
    [id] [int] NOT NULL,
    [fid] [int] NOT NULL,
    [sid] [int] NOT NULL,
    [UpdatePass] [int] NULL
) ON [PRIMARY]
GO
CREATE TRIGGER trgFERUpdate ON FER FOR INSERT, UPDATE AS
INSERT INTO FERUpdate (id, fid, sid)
SELECT ins.id, ins.fid, ins.sid FROM inserted ins

2. bcp.bat

isql -U <user> -P <pw> -S server -Q "update AA..FERUpdate set UpdatePass=1 where UpdatePass is null"
bcp "select a.* from AA..FER a, AA..FERUpdate b where a.fid=b.fid and a.sid=b.sid and b.fid<>-1 and b.sid<>-1 and b.updatepass=1" queryout %TFN_NOW%.wrk -U <user> -P <pw> -S server -f FER.fmt
isql -U <user> -P <pw> -S server -Q "delete from AA..FERUpdate where UpdatePass=1"

I have been struggling with this for two days. Any help is appreciated. Please help me out!! Thanks!!!
So I set up a flat file CSV connection with comma as the column delimiter, CR/LF as the row delimiter, etc. Everything works as planned and the package executes.
Now let's say the file comes in wrong and has two columns instead of three. It looks like this:

R1C1, R1C2
R2C1, R2C2
R3C1, R3C2
The SSIS file manager reads the file as two rows instead of failing. So it reads the above as this:

R1C1, R1C2, {CR/LF}R2C1
R2C2, {CR/LF}R3C1, R3C2
This is obviously not right. You get similar issues if they send an extra column, where the data is just appended to the last column. How do you get SSIS to fail or error out if the column count is wrong?
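One defensive pattern (a sketch; the column name is hypothetical): read each line into a single column, then put a Conditional Split ahead of the real parsing with an expression that counts delimiters, routing bad rows to an error output:

(LEN([RawLine]) - LEN(REPLACE([RawLine], ",", ""))) != 2

Rows that pass are then carved into the three real columns. Because the row is read whole, a missing or extra comma is detected instead of silently swallowing the next line.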
I have a good .mdf file that was not shut down gracefully, because the .ldf file was on a hard disk that went bad. Using the ATTACH_REBUILD_LOG statement fails because the database wasn't cleanly shut down. This link http://wiki.servertastic.com/Attaching_a_MDF_file_without_The_LDF doesn't work in 2005, so now what? TIA
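The last-resort sequence usually cited for this case (heavily hedged: it can lose transactions, and MyDb plus the file path are placeholders) is to create a dummy database with the same name and file layout, stop SQL Server, swap the good .mdf over the dummy's, restart so the database comes up suspect, and then repair in emergency mode:

ALTER DATABASE MyDb SET EMERGENCY;
ALTER DATABASE MyDb SET SINGLE_USER;
DBCC CHECKDB ('MyDb', REPAIR_ALLOW_DATA_LOSS);  -- rebuilds the log as part of the repair
ALTER DATABASE MyDb SET MULTI_USER;

Only worth attempting because there is no clean alternative once ATTACH_REBUILD_LOG refuses the file.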