Hi,
I ran the following command and it completed without errors, but I didn't find a file named authors.txt. Where will I find this authors.txt?
exec master..xp_cmdshell 'bcp pubs..authors out authors.txt -c -Sserver -Uuser -Ppwd'
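Worth noting: xp_cmdshell runs on the server, so a relative file name lands in the working directory of the SQL Server service on the server machine, not on the client. A minimal sketch that avoids the ambiguity by giving bcp an explicit output path (the C:\temp path and the server/login placeholders are assumptions):

exec master..xp_cmdshell 'bcp pubs..authors out C:\temp\authors.txt -c -Sserver -Uuser -Ppwd'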
Hi, I used the /e option in my bcp command, yet I did not get all the rows from the mainframe into the SQL tables. Here is the case: I have 11 million rows in a file on an FTP server, and I use bcp to load them into SQL Server. Can anyone check whether this process is sound? I am missing one million rows after the bcp load and do not know why. I added /e to capture any errors, but I cannot find an error file anywhere on my hard drive. Please check it out and let me know.
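For reference, bcp only creates the -e error file when it actually rejects rows, and by default it gives up after 10 errors. A minimal sketch of a load that logs rejects and raises the error ceiling (the file, table, and login names are assumptions, not the original job):

exec master..xp_cmdshell 'bcp MyDB..MyTable in C:\data\input.txt -c -e C:\data\bcp_errors.txt -m 1000 -Sserver -Uuser -Ppwd'

If the error file never appears, the missing rows were probably never handed to bcp at all (for example a truncated FTP transfer), so comparing the source file's line count against the table's row count is a good first check.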
Can someone please help me troubleshoot a SQL Server authenticated user? One of the users I created shows a login name of <None>. I am unable to delete this user and recreate his login, nor can I assign a login name. I believe there is a script to correct this problem, but I am having trouble finding it. I am working under the gun to resolve this issue, so anyone's help is much appreciated.
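A login name of <None> usually means an orphaned database user: the user's SID no longer matches any server login. A minimal sketch of the classic fix (the name MyUser is a placeholder):

-- list orphaned users in the current database
exec sp_change_users_login 'Report'
-- re-link the database user to the login of the same name
exec sp_change_users_login 'Auto_Fix', 'MyUser'

On SQL Server 2005 SP2 and later the same re-link can be written as ALTER USER MyUser WITH LOGIN = MyLogin.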
I have encountered an issue with one of my log files which has me somewhat puzzled. I received notification at 00:02:16:36 that the log file was full: "The log file for database 'TDS' is full. Back up the transaction log for the database to free up some log space." A large number of these same messages have followed.

Prior to these messages starting, the log file was backed up successfully at 00:00:01:10. Shortly after this there is a message, and this is the bit that confuses me, at 00:02:16:35 that says: d:\mssql\data\tds_log.ldf: Operating system error 112(error not found) encountered.

So from what I can tell the log file was backed up successfully. Then approximately 1 minute later the system can't find the log file??? Then I keep getting these log-file-full messages until the next log backup (60 minutes later)... then everything is fine. HUH?
When looking at the log file for SQL it is sitting there, and isn't new, as its created date is back when I set up the database initially.

Can anyone shed any light on what this message might mean, and whether there is anything specific I can do to circumvent it in the future?

Oh yeah, we have nothing tricky in terms of replication or log shipping or anything like that...
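For what it's worth, Windows operating system error 112 is ERROR_DISK_FULL ("There is not enough space on the disk"), so the message against the .ldf most likely means an autogrow attempt failed, which would also explain the flood of log-full errors until the next log backup freed space. A minimal sketch of the checks I'd run (the sys.databases query assumes SQL Server 2005 or later):

-- how full each database's log really is
DBCC SQLPERF(LOGSPACE)
-- free space per drive as SQL Server sees it
exec master..xp_fixeddrives
-- what is preventing log truncation for TDS
SELECT name, log_reuse_wait_desc FROM sys.databases WHERE name = 'TDS'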
Recently my server hard disk crashed and I lost one of the secondary data files of a SQL Server database. I have the main file intact, and the recent data was in that file only, but I was unable to attach the database because it requires the other data files as well.

Is there any way to create the database again from the existing data file? Unfortunately there is no backup either which we could use to recreate the missing data file.
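With a data file gone and no backup there is no supported path back, but a well-known last-resort hack is to give the database its file "slots" back and then run a data-loss repair. A minimal sketch, assuming the database is named MyDB, SQL Server 2005 or later, and that you accept REPAIR_ALLOW_DATA_LOSS can discard data:

-- 1. Create a dummy database with exactly the same file names and paths as the lost one.
-- 2. Stop SQL Server, swap the surviving .mdf over the dummy's files, restart (MyDB comes up SUSPECT).
ALTER DATABASE MyDB SET EMERGENCY;
ALTER DATABASE MyDB SET SINGLE_USER;
DBCC CHECKDB ('MyDB', REPAIR_ALLOW_DATA_LOSS);

Whatever lived in the missing .ndf stays lost; script out and copy what this salvages into a fresh database.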
I'm in BIG BIG trouble... A co-worker tried to help me out with a huge database that had a huge log file. So he first detached my db and deleted the log .ldf file... Now when I try to attach it back I get an ERROR... Could someone help me out, please?
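If the database was cleanly detached before the .ldf was deleted, SQL Server can usually rebuild a fresh log at attach time. A minimal sketch (the database name and path are placeholders):

-- SQL Server 2005 and later
CREATE DATABASE MyDB ON (FILENAME = 'C:\Data\MyDB.mdf') FOR ATTACH_REBUILD_LOG;
-- SQL Server 2000
exec sp_attach_single_file_db 'MyDB', 'C:\Data\MyDB.mdf'

If the attach refuses because the database wasn't shut down cleanly, it is the emergency-repair situation sketched above.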
I have been trying to find a way to add the File Enumerator collection option to my version of BIDS. Newer versions have a Foreach File Enumerator and I can't seem to find that option. Can anyone help?
Hello, I am new to SQL Server and would like to connect to a particular database on the server using SQL. I have looked at various SQL sites with how-tos, and none of them mention where I can locate the input file name.
Dear All, SOS, please help. I have an MS SQL DB with 4 .ndf files. One (the first) .ndf file is missing; somehow it got deleted. Is there any way I can rebuild my DB? The .MDF and .LDF files are intact. Please help ASAP.
Dhumbak
I have made a trigger on table 'FER' that fires when data is inserted or updated in the table. I also made a batch file that uses bcp to extract the newly updated/inserted records. But I am getting missing data in the bcp out file, like this: missing 1200 records, blocked at:

/*
777946 296188 2007-01-29 21:25:45.063
778145 296494 2007-01-29 21:25:47.063
*/

1. trigger.sql

CREATE TABLE [FERUpdate] (
    [id] [int] NOT NULL,
    [fid] [int] NOT NULL,
    [sid] [int] NOT NULL,
    [UpdatePass] [int] NULL
) ON [PRIMARY]
GO
create trigger trgFERUpdate on FER for insert, update as
insert into FERUpdate (id, fid, sid)
select ins.id, ins.fid, ins.sid from inserted ins

2. bcp.bat

-----
isql -U <user> -P <pw> -S server -Q "update AA..FERUpdate set UpdatePass=1 where UpdatePass is null"
bcp "select a.* from AA..FER a, AA..FERUpdate b where a.fid=b.fid and a.sid=b.sid and b.fid<>-1 and b.sid<>-1 and b.updatepass=1" queryout %TFN_NOW%.wrk -U <user> -P <pw> -S server -f FER.fmt
isql -U <user> -P <pw> -S server -Q "delete from AA..FERUpdate where UpdatePass=1"
-----

I have been struggling with this for two days. Any help is appreciated. Please help me out! Thanks!
So I set up a flat file CSV connection with comma as the column delimiter, CR/LF as the row delimiter, etc. Everything works as planned and the package executes.

Now let's say the file comes in wrong and has two columns instead of three. It looks like this:

R1C1, R1C2
R2C1, R2C2
R3C1, R3C2

The SSIS flat file connection manager reads the file as two rows instead of failing. So it reads the above as this:

R1C1, R1C2, {LF/CR}R2C1
R2C2, {LF/CR}R3C1, R3C2

This is obviously not right. You get similar issues if they send an extra column, where the extra data is just appended to the last column. How do you get SSIS to fail or error out if the column count is wrong?
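One workaround, since the flat file source swallows or pads columns rather than failing, is to pull each line in whole and validate the delimiter count yourself before splitting. A minimal T-SQL sketch of that idea, assuming the file is loaded line-per-row into a one-column staging table (the table and file names are placeholders):

-- stage each raw line as a single value
CREATE TABLE StagingRaw (line varchar(max));
BULK INSERT StagingRaw FROM 'C:\data\input.csv' WITH (ROWTERMINATOR = '\n');
-- fail the load if any line does not have exactly 2 commas (3 columns)
IF EXISTS (SELECT 1 FROM StagingRaw
           WHERE LEN(line) - LEN(REPLACE(line, ',', '')) <> 2)
    RAISERROR('Input file has the wrong column count.', 16, 1);

Inside SSIS itself the same check can be done in a Script Component that counts delimiters per row and fails the component when the count is off.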
I have a good .mdf file, but the database was not shut down gracefully because the .ldf file was on a hard disk that went bad. Using the ATTACH_REBUILD_LOG statement fails because the database wasn't cleanly shut down. This link http://wiki.servertastic.com/Attaching_a_MDF_file_without_The_LDF doesn't work in 2005, so now what? TIA
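When FOR ATTACH_REBUILD_LOG refuses because of an unclean shutdown, the usual 2005-era way out is the emergency-mode, data-loss route sketched earlier in this list: get the .mdf attached by swapping it over a dummy database's files, then (the database name is a placeholder):

ALTER DATABASE MyDB SET EMERGENCY;
ALTER DATABASE MyDB SET SINGLE_USER;
-- rebuilds the missing log as part of the repair; transactions that were in flight are lost
DBCC CHECKDB ('MyDB', REPAIR_ALLOW_DATA_LOSS);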
Hello,
We have a query which returns ~2.8 million rows. This same query is used in a DTS package, which exports to a text file. The number of rows in this text file, however, is ~2.7 million rows (I'm rounding, of course). So a good chunk of data appears to have vanished in the export. Using SQL Server 7.0 on Windows 2000. Has anyone seen bugs with DTS text exports for very large amounts of data?
Thanks,
DF
"Never eat more than you can lift." Miss Piggy
Microsoft (R) SQL Server Execute Package Utility Version 9.00.3042.00 for 32-bit
Copyright (C) Microsoft Corp 1984-2005. All rights reserved.
Started: 10:01:27 PM
Warning: 2007-11-30 22:01:29.52
Code: 0x80012012
Source: Package
Description: The configuration file name "C:\Development\dtsConfig.xml" is not valid. Check the configuration file name.
End Warning ...
But then it goes on to read values from my Production config file. How can I stop the annoying warning about my Development config file path (which doesn't exist on the Production machine)?
I am importing records from a flat file to a database table. If a record is in the table but NOT in the flat file, I need to update a date column in the table.
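A common pattern for this is to land the flat file in a staging table first, then flag the table rows that have no match. A minimal T-SQL sketch, assuming a key column named id and a staging table already loaded from the file (all names are placeholders):

UPDATE t
SET    t.LastMissingDate = GETDATE()
FROM   dbo.TargetTable t
WHERE  NOT EXISTS (SELECT 1 FROM dbo.StagingTable s WHERE s.id = t.id);

In SSIS this maps naturally to a data flow that loads the staging table, followed by an Execute SQL Task running the update.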
Log shipping was working fine till last Friday... It looks like there was a cluster node failover during the weekend on the secondary SQL cluster... Since then, the restore log job is not working. I can see all the log backup files being copied to the secondary server, so it's not the missing transaction-log-backup-share problem... I see this message:

The RESTORE statement could not access file 'J:\MSSQLBackup\DBName_200712081350.tuf'. Error was '2(The system cannot find the file specified.)'

I googled and found that the .tuf file holds the LSN information about which log needs to be restored next... I looked at the SQL error log and tried to restore the next log manually with the "norecovery" option, but I still get the same message... Is there any option other than removing and reconfiguring log shipping?
I don't know if anyone has faced this issue. We are having a strange problem. Our process worked well when it was implemented on a 32-bit processor; it ran perfectly for 6 months without a problem. But when we moved the packages to a 64-bit machine, this issue, along with some other issues, started to show up.

The issue is that we are missing files in the source folder.

Our process is designed such that a source process brings in a file and updates a status for the file in an audit table. The ETL process picks up the file, assigns the status 'running' when the SRC process is complete, loads it into the target DB, and updates the ETL status to complete. But the current problem is that the ETL is losing files after it assigns the status 'running'. When we looked in the DB to see whether the data was loaded, we could not find any data related to these files.

We have mapping-level parameters for the source path and target path.

We are using a For Each Loop task and processing files (which are simple flat files) in the source path. The file name is stored in the mapping-level parameter. Once a file is processed, we move it into a target path.

Our source and target file paths are on the same drive; we just have a src folder, and inside the src folder we have a processed folder and a failed folder. So files are picked from the source folder and moved into the processed folder after processing. The missing files are not even moved to the failed folder.

There is a lot of other processing going on on this box, and the trend observed is that when more processes are running at peak hour, the missing-file count is higher.

Right now we are re-fetching those files as a workaround, but does anyone have a suggestion as to why this is happening, or any better implementation suggestions?
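The symptom (losses growing with concurrency) suggests two package instances may be claiming the same file between the status update and the move. If that is what's happening, one mitigation is to make the claim itself atomic in the audit table, so only one worker can ever own a file. A minimal T-SQL sketch, with all table and column names assumed (the real audit schema will differ):

-- atomically claim one pending file; the OUTPUT row tells this worker which file it owns
UPDATE TOP (1) dbo.FileAudit
SET    status = 'running', claimed_by = @@SPID
OUTPUT inserted.file_name
WHERE  status = 'pending';

A worker that gets no row back simply has nothing to process, so two instances can no longer grab the same file.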
We had some SAN issues, and we don't have transaction log files for some databases. The drive which was holding the T-log files went missing. How do we bring the databases back?
I created a simple SSIS package that downloads a file from an FTP server and does some processing on it. I scheduled it as a job step with SQL Server Agent. The problem is that this file is not always available for pickup, but when it is, I need it very quickly. I'm setting the schedule to look for it every minute. Any time the file is not there, the package fails and shows up in the job history in red.
Is there any way to prevent an error in this task from registering a package failure?
I cannot seem to locate this file anywhere on my computer, let alone in the specified directory. Without this file, it would be very difficult if not impossible to go through the rest of the SSIS tutorials. Could I FTP this file from some location so I could get started on the tutorials?
I'm new to SSIS and trying to automate data imports from text files. The text files I'm importing always contain a fixed set of columns, or a subset of those columns. If I include a subset of columns in the import file (and exclude others), the data doesn't import... I assume because the actual file doesn't include every column defined in the flat file source object?
Is it possible to accomplish this without dynamically selecting the columns, as indicated here: http://msdn2.microsoft.com/en-us/library/ms136020.aspx
How do I set the Transform Data Task property in an SSIS package?

When I designed the SSIS package, I mapped my text file columns to database table columns. But if I select the wrong input text file, one having fewer columns than the database table, how will I know that it is the wrong input file? Even with a correct three-column input, I get wrong values in the table: the 1st column of the 2nd row is placed in the fourth column of the previous row in the DB table, which is a very weird situation.

Suppose my DB table contains 4 columns and my (wrong) text file contains 2 columns; I would expect an error message saying that column003 was not found, like what happened in DTS 2000.
I am trying to send a CSV file with 15000 records via Database Mail in SQL Server 2014. The problem is that when I open my email, the CSV only contains 209 records. I have tried the same thing in SQL Server 2012 and it works as expected: it sends all 15000 records in the CSV.

I have tested this on several SQL Servers with the 2014 edition on them, and I have the same issue on all of them. The query breaks off at a different point on each server; for example, one of them breaks off at 209 records as I said above, another one at 307. The last record always gets truncated at the same place. The CSV attachment size is about 64 KB, which is well below the 4 MB limit I've configured in the Database Mail Maximum File Size (Bytes) parameter.

What I am doing, basically, is creating a job that executes a stored procedure and sends the results as a CSV in an email. The stored procedure is something like:
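(For reference, a minimal sketch of the kind of sp_send_dbmail call being described; the profile, recipient, and query here are placeholders, not the original procedure:)

EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'MailProfile',
    @recipients = 'someone@example.com',
    @subject = 'Export',
    @query = 'SET NOCOUNT ON; SELECT col1, col2 FROM MyDB.dbo.MyTable',
    @attach_query_result_as_file = 1,
    @query_attachment_filename = 'export.csv',
    @query_result_separator = ',',
    @query_result_no_padding = 1;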
I have an SSIS package that inserts website URLs from a SQL Server table into a variable used by an HTTP Connection Manager, then downloads the data files from those URLs using a ForEach Loop and a Script Task. It works beautifully when a data file is found at the URL, but hangs if no data file is found. I've set the Timeout property on the HTTP Connection Manager to 30 seconds, but that doesn't help. How do I first check whether a data file exists, or whether the request returns nothing, or how do I trap this situation in a try-catch?
Here is the VB code I'm using in the Script Task:
Public Sub Main()
    Try
        ' Connect to the website using the HTTP connection manager
        Dim nativeObject As Object = Dts.Connections("HTTP Connection Manager").AcquireConnection(Nothing)
        ' Create a new HTTP client connection
        Dim connection As New HttpClientConnection(nativeObject)
        ' ... download logic continues here ...
    Catch ex As Exception
        ' Mark the task failed instead of hanging silently (Dts.Results.Failure in the 2005 template)
        Dts.TaskResult = Dts.Results.Failure
    End Try
End Sub
Hi everyone, when I browse my web application, the error below is displayed. What should I do? Any suggestions or tips?
"Cannot start your application. The workgroup information file is missing or opened exclusively by another user. Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Data.OleDb.OleDbException: Cannot start your application. The workgroup information file is missing or opened exclusively by another user. "