DTS: EXCEL TO SQL: Fails To Import Data Once In A While
Mar 31, 2003
Hi there,
I use an ASP.NET application to upload an Excel file, then a DTS package to import the data from the file into SQL Server 2000, and finally display the imported data on the screen.
The DTS package starts by setting some variables with the help of a Dynamic Properties task.
On success, the DTS package runs 2 simultaneous Transform Data tasks importing data from Excel to SQL Server 2000.
This works fine most of the time, but then there are the other times.
One of the Transform Data tasks (TDT) reads nothing from the Excel file, yet it can read the data if I upload the same file again right afterwards.
Any kind of input is welcome
Thanx
/bs26
View 1 Replies
May 6, 2008
Dear all,
I need your help. I'm working on a website project using ASP.NET, and I have to register the users of this site; there are over 200 users, so I'm thinking of a way to save time. All the information for these users is stored in an Excel file. What I want to do is import this data from the Excel file into a table in my database (SQL Server). Could you help with the coding in VB.NET? Thanks in advance.
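A minimal VB.NET sketch of one way to do this, assuming the Jet provider is available; the file path, sheet name, table name and connection strings are all placeholders, and the sheet's header row is assumed to match the columns of the destination table:
Code Snippet
Imports System.Data
Imports System.Data.OleDb
Imports System.Data.SqlClient

Module ImportUsers
    Sub Main()
        ' Placeholder names: Users.xls with a Users$ sheet whose headers match dbo.Users
        Dim xlConn As String = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\temp\Users.xls;" & _
                               "Extended Properties=""Excel 8.0;HDR=YES"""
        Dim sqlConn As String = "Data Source=myServer;Initial Catalog=myDB;Integrated Security=SSPI"

        ' Read the whole sheet into a DataTable
        Dim users As New DataTable()
        Using da As New OleDbDataAdapter("SELECT * FROM [Users$]", xlConn)
            da.Fill(users)
        End Using

        ' Bulk copy into the destination table, mapping columns by name
        Using bulk As New SqlBulkCopy(sqlConn)
            bulk.DestinationTableName = "dbo.Users"
            For Each col As DataColumn In users.Columns
                bulk.ColumnMappings.Add(col.ColumnName, col.ColumnName)
            Next
            bulk.WriteToServer(users)
        End Using
    End Sub
End Module
SqlBulkCopy loads the whole sheet in one round trip; for only a couple of hundred users a simple row-by-row INSERT would work just as well.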
View 9 Replies
View Related
Sep 13, 2007
Hello,
I have a problem with the Import of an Excel file with SSIS and hope one of you can help me out.
There is a column with mixed data (the cell format is Text) in an Excel file, and I want to import it as text (DT_WSTR(255)).
So far everything works fine, but some fields like "9760020" import as "9.76002e+006".
My settings so far are:
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=<FileName>;Extended Properties="EXCEL 8.0;HDR=NO;IMEX=1"
In addition I altered the registry entries for the Jet Excel engine (HKLM\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel):
TypeGuessRows = 0 and ImportMixedTypes = Text
Has someone got a solution?
Thankx
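For comparison, a minimal VB.NET sketch that reads such a column outside SSIS with the same provider settings; the file name, sheet name and the choice of column F14 are placeholders. With ImportMixedTypes = Text and IMEX=1 the values should come back as strings:
Code Snippet
Imports System.Data.OleDb

Module CheckMixedColumn
    Sub Main()
        ' Placeholder file name; HDR=NO and IMEX=1 mirror the settings above
        Dim connStr As String = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\temp\input.xls;" & _
                                "Extended Properties=""Excel 8.0;HDR=NO;IMEX=1"""
        Using cn As New OleDbConnection(connStr)
            cn.Open()
            Using cmd As New OleDbCommand("SELECT F14 FROM [Sheet1$]", cn)
                Using rdr As OleDbDataReader = cmd.ExecuteReader()
                    While rdr.Read()
                        ' With mixed-type handling set to Text the value should arrive
                        ' as a string, e.g. "9760020" instead of 9.76002E+06
                        Console.WriteLine(rdr(0).ToString())
                    End While
                End Using
            End Using
        End Using
    End Sub
End Module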
View 7 Replies
View Related
Mar 26, 2007
Hours wasted, what else is new. I try to import an Excel file into a table; the longest field (measured via a VBA macro report) is 278 characters and the receiving DB column is 4000, yet I get a truncation error and the import stops. The only thing you can find on the web is "make sure you have SQL Server 2005 SP2". I installed it, same thing, and who knows what other problems I have just created by installing another piece of Microsoft magic.
So, MVPs, when is this ridiculous situation going to be fixed? Oh, that's right, your answer will be "go to Vista", which won't fix my problem but will probably help your stock situation.
MS owes me (conservatively) 100K. My current plan is to install Linux, get up on MySQL, and never deal with shoddy, half-built pieces of garbage again.
So where is the fix?
When will we see it (short of buying another bloated piece of... oh, I mean Vista)?
An addendum:
I am trying to import 25,000 rows with approximately 20 columns from Excel.
I added a first column with an ID. If I import the whole table, it dies on row 1852 with truncation problems.
If I empty that column, everything goes in. I made another Excel file and got rid of all rows except the ID and the bad row (it contains web addresses, e.g. http://www. blah blah).
Importing that Excel file, it dies on row 2395; in other words, the first offending cell was no problem when it was the 2nd row instead of the 15th row.
Try to tell me this is not a random bug. Long live MySQL.
View 3 Replies
View Related
May 18, 2004
Hi.
Is there any way to export the table structure (data type, length, nullability, description) into an Excel file using MS SQL Server?
Or do I need to do it manually?
Thank you in advance.
Sincerely
Agustina
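One possible approach, sketched in VB.NET against INFORMATION_SCHEMA.COLUMNS (server, database and output path are placeholders); the tab-delimited output opens directly in Excel. Column descriptions are stored as extended properties and would need a separate query (e.g. ::fn_listextendedproperty), which this sketch does not cover:
Code Snippet
Imports System.Data.SqlClient
Imports System.IO

Module ExportTableStructure
    Sub Main()
        ' Placeholder connection string and output path
        Dim connStr As String = "Data Source=myServer;Initial Catalog=myDB;Integrated Security=SSPI"
        Dim sql As String = _
            "SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH, IS_NULLABLE " & _
            "FROM INFORMATION_SCHEMA.COLUMNS ORDER BY TABLE_NAME, ORDINAL_POSITION"
        Dim writer As New StreamWriter("C:\temp\TableStructure.txt")
        Dim cn As New SqlConnection(connStr)
        Dim cmd As New SqlCommand(sql, cn)

        cn.Open()
        writer.WriteLine("Table" & vbTab & "Column" & vbTab & "Type" & vbTab & "Length" & vbTab & "Nullable")
        Dim rdr As SqlDataReader = cmd.ExecuteReader()
        While rdr.Read()
            writer.WriteLine(rdr("TABLE_NAME").ToString() & vbTab & rdr("COLUMN_NAME").ToString() & vbTab & _
                             rdr("DATA_TYPE").ToString() & vbTab & rdr("CHARACTER_MAXIMUM_LENGTH").ToString() & vbTab & _
                             rdr("IS_NULLABLE").ToString())
        End While
        rdr.Close()
        cn.Close()
        writer.Close()
    End Sub
End Module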
View 2 Replies
View Related
Sep 12, 2007
Hello,
I have a problem with the Import of an Excel file and hope one of you can help me out.
There is a column with mixed data (the cell format is Text) in an Excel file, and I want to import it as text (DT_WSTR(255)).
So far everything works fine, but some fields like "9760020" import as "9.76002e+006".
My settings so far are:
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=<FileName>;Extended Properties="EXCEL 8.0;HDR=NO;IMEX=1"
In addition I altered the registry entries for the Jet Excel engine:
TypeGuessRows = 0 and ImportMixedTypes = Text
Has someone got a solution?
Thankx
View 3 Replies
View Related
Sep 29, 2006
Hi,
Our company wants to allow our customers the ability to import employee data. Each customer's employee data changes depending on things like organisation structure etc. so the format of the data to be imported needs to change. We can do a lot of this with dynamic SQL, but are looking at moving it to SSIS as we think it will save us a lot of pain later on and for other features in SSIS.
We're stuck at a fairly early part of the process, as we don't know how to dynamically import our spreadsheet. So far our best idea is to have the Excel source as the only step in the data flow task, and then use a script task in the control flow that connects to the Excel source (with the source set to not treat the first row as headers, so we can do a 'select top 1 *...') and builds a dynamic SQL command to create the table.
Once this is done we would then have another data task that actually puts the data into the newly created table. This all sounds very difficult though. What are the options for doing what we want to do... have a feeling that we're missing something basic.
Any help appreciated,
TIA, Anthony.
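A rough VB.NET sketch of the 'select top 1 *' idea outside of SSIS, assuming headers in the first row and placeholder file, sheet, table and server names; every column is created as NVARCHAR(255) just to get a staging table in place:
Code Snippet
Imports System.Data
Imports System.Data.OleDb
Imports System.Data.SqlClient
Imports System.Text

Module CreateTableFromSheet
    Sub Main()
        ' Placeholder paths/names; assumes headers are in the first row (HDR=YES)
        Dim xlConn As String = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\temp\employees.xls;" & _
                               "Extended Properties=""Excel 8.0;HDR=YES"""
        Dim header As New DataTable()
        Using da As New OleDbDataAdapter("SELECT TOP 1 * FROM [Sheet1$]", xlConn)
            da.Fill(header)                    ' column names come from the header row
        End Using

        ' Build a CREATE TABLE statement; every column is NVARCHAR(255) for simplicity
        Dim ddl As New StringBuilder("CREATE TABLE dbo.ImportStaging (")
        For i As Integer = 0 To header.Columns.Count - 1
            If i > 0 Then ddl.Append(", ")
            ddl.Append("[" & header.Columns(i).ColumnName & "] NVARCHAR(255) NULL")
        Next
        ddl.Append(")")

        Using cn As New SqlConnection("Data Source=myServer;Initial Catalog=myDB;Integrated Security=SSPI")
            cn.Open()
            Using cmd As New SqlCommand(ddl.ToString(), cn)
                cmd.ExecuteNonQuery()
            End Using
        End Using
    End Sub
End Module
A data flow (or a second OleDbDataAdapter fill plus SqlBulkCopy) can then load the data into the newly created table.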
View 4 Replies
View Related
Feb 4, 2008
Hi there,
I was wondering about the stability of SSIS when it comes to importing data on a real-time basis. Let's say you have a scenario where flat files, for instance, will be dropped at random intervals ranging from 1 second to 10 seconds apart and the importer has to import these files immediately.
I would imagine that this is done with a package that runs a loop sniffing the directory forever, but I stand to be corrected on the best way of doing it.
I'm not too sure whether SSIS is a good idea for this, as lots of people in my company have made bad comments about SSIS in real time, but they cannot elaborate on why well enough to convince me. I have done some pretty cool stuff with it and must admit that I am a fan of the technology, but I don't want to defend it into a corner.
Please give me your thoughts on this
Regards
Michael
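As a sketch of the alternative to a forever-looping package, a small VB.NET watcher that runs the import package whenever a file lands in the drop folder; the folder, package path and the variable name strFileName are placeholders:
Code Snippet
Imports System.IO
Imports Microsoft.SqlServer.Dts.Runtime

Module WatchAndLoad
    Sub Main()
        ' Placeholder drop folder and file pattern
        Dim watcher As New FileSystemWatcher("C:\drop", "*.txt")
        AddHandler watcher.Created, AddressOf OnFileCreated
        watcher.EnableRaisingEvents = True
        Console.WriteLine("Watching C:\drop - press Enter to stop")
        Console.ReadLine()
    End Sub

    Private Sub OnFileCreated(ByVal sender As Object, ByVal e As FileSystemEventArgs)
        ' Run the import package, passing the new file's path in a package variable
        Dim app As New Application()
        Dim pkg As Package = app.LoadPackage("C:\packages\ImportFlatFile.dtsx", Nothing)
        pkg.Variables("strFileName").Value = e.FullPath
        Dim result As DTSExecResult = pkg.Execute()
        Console.WriteLine(e.FullPath & " -> " & result.ToString())
        pkg.Dispose()
    End Sub
End Module
In practice the handler should also wait until the file has finished copying before executing the package.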
View 13 Replies
View Related
Apr 28, 2008
I get anywhere from a couple hundred to a hundred thousand records that need to be updated or inserted into their SQL Server 2005 destination. What are some of the best ways to accomplish this? Right now we are doing it manually through line-by-line updates and inserts. Would I use BCP or some other bulk import tool?
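A hedged sketch of one common pattern in VB.NET: bulk copy the incoming rows into a staging table, then run one set-based UPDATE and one INSERT against the target. All table and column names here are placeholders:
Code Snippet
Imports System.Data
Imports System.Data.SqlClient

Module StageAndUpsert
    Sub Upsert(ByVal batch As DataTable)
        ' Placeholder names: dbo.Staging mirrors dbo.Target and both are keyed on Id
        Dim connStr As String = "Data Source=myServer;Initial Catalog=myDB;Integrated Security=SSPI"
        Using cn As New SqlConnection(connStr)
            cn.Open()
            ' 1. Bulk load the incoming rows into the staging table
            Using bulk As New SqlBulkCopy(cn)
                bulk.DestinationTableName = "dbo.Staging"
                bulk.WriteToServer(batch)
            End Using
            ' 2. One set-based UPDATE for existing rows, one INSERT for new ones
            Dim upsertSql As String = _
                "UPDATE t SET t.Col1 = s.Col1, t.Col2 = s.Col2 " & _
                "FROM dbo.Target t JOIN dbo.Staging s ON s.Id = t.Id; " & _
                "INSERT INTO dbo.Target (Id, Col1, Col2) " & _
                "SELECT s.Id, s.Col1, s.Col2 FROM dbo.Staging s " & _
                "WHERE NOT EXISTS (SELECT 1 FROM dbo.Target t WHERE t.Id = s.Id);"
            Using cmd As New SqlCommand(upsertSql, cn)
                cmd.ExecuteNonQuery()
            End Using
        End Using
    End Sub
End Module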
View 5 Replies
View Related
Mar 12, 2007
I'm pretty new to using SQL 2005 Management Studio. Generally speaking, it works pretty much the way I'm used to (using Enterprise Manager) as far as moving data around and designing databases is concerned. But I've been trying to import some data to my local SQL server from an online SQL database and I am getting the most bizarre results.
Basically, it appears to work perfectly, but when the import is finished, there is only one new row in the destination table.
I have tried this with two completely different online databases and I have tried importing using a query and just downloading the table as-is, but whatever happens, I just get one row! All of the databases I'm exporting from or importing into are SQL 2000 - I just happen to be using the SQL 2005 client software.
Have you heard of anything like this?
I'm appending the report to the end of this email. As you can see, it says "success, success" all the way down and it clearly states "71236 rows transferred", but when I get done, there's just one.
Any thoughts?
One explanation is that every new row is overwriting the last new row somehow, but I don't think so because the row that actually gets copied is always the first one in the record set, not the last. Unless they're being transferred in reverse order, I suppose.
I'm pretty stumped and I haven't found any useful blogs or help on the web.
Hope you can think of something because I don't have Enterprise Manager on my computer any more and it's going to be a pain to install it.
Jon
Here's the report:
The execution was successful
- Initializing Data Flow Task (Success)
- Initializing Connections (Success)
- Setting SQL Command (Success)
- Setting Source Connection (Success)
- Setting Destination Connection (Success)
- Validating (Success)
- Prepare for Execute (Success)
- Pre-execute (Success)
- Executing (Success)
- Copying to [inframes].[dbo].[sum_shop_clicks] (Success)
* 71236 rows transferred
Messages
* Information 0x402090df: Data Flow Task: The final commit for the data insertion has started.
(SQL Server Import and Export Wizard)
* Information 0x402090e0: Data Flow Task: The final commit for the data insertion has ended.
(SQL Server Import and Export Wizard)
- Post-execute (Success)
- Cleanup (Success)
Messages
* Information 0x4004300b: Data Flow Task: "component "Destination - sum_shop_clicks" (37)" wrote 71236 rows.
(SQL Server Import and Export Wizard)
View 12 Replies
View Related
Mar 12, 2008
Hello everybody,
I have a problem with RS. I have a parametrized report; the parameters are DateFrom and DateTo. It always works in the viewer, but when I click export to any format and DateTo is far from DateFrom (more than about 1 month, which generates about 5 pages, roughly 200 rows), the browser asks again for a user login and doesn't accept my login (and it's strange that the prompt appears at all). Then I get the message "You are not authorized to view this page".
Where is the problem? When I choose a smaller range, it also works OK in exports.
View 1 Replies
View Related
Jul 20, 2006
Hi All!
I am trying to create an Excel table destination, which fails with the error "Table cannot be created - System Error in Field definition", whereas the same works with the Flat File destination. What could be the problem?
Thanks
jas
View 1 Replies
View Related
May 20, 2008
I am trying to generate Excel documents from a SQL Server 2005 database using SSIS packages. I created a schedule, which works fine when it runs every 5 minutes, but it stops working when I change the frequency back to once a day. The schedule works only once; the second day it fails with the following error:
Code: 0xC0202009 Source: Data Flow Task Destination - Query [22] Description: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E37. End Error
Error: 2008-05-20 10:43:01.46 Code: 0xC02020E8 Source: Data Flow Task Destination - Query [22] Description: Opening a rowset for "Query" failed. Check that the object exists in the database. End Error
Error: 2008-05-20 10:43:01.46 Code: 0xC004701A Source: Data Flow Task DTS.Pipeline Description: component "Destination - Query" (22) failed the pre-execute phase and returned error code 0xC02020E8. End Error
DTExec: The package execution returned DTSER_FAILURE (1). Started: 10:43:00 AM Finished: 10:43:01 AM Elapsed: 0.609 seconds. The package execution failed. The step failed.
I will appreciate any idea and help.
View 1 Replies
View Related
May 20, 2008
Hi,
I can use an HTTP connection to Analysis Services 2005 from Excel, but when I try to use IIS it fails.
AS and IIS are on different machines and not in the same domain.
In the event log of the server with AS I can see the following event:
Event Type: Information
Event Source: MSOLAP ISAPI Extension: \\?\C:\Inetpub\wwwroot\olap\msmdpump.dll
Event Category: (269)
Event ID: 10
Date: 5/20/2008
Time: 10:17:02 AM
User: N/A
Computer: *******
Description:
The description for Event ID ( 10 ) in Source ( MSOLAP ISAPI Extension: \\?\C:\Inetpub\wwwroot\olap\msmdpump.dll ) cannot be found. The local computer may not have the necessary registry information or message DLL files to display messages from a remote computer. You may be able to use the /AUXSOURCE= flag to retrieve this description; see Help and Support for details. The following information is part of the event: .
I assume it's a permission issue. Excel and IIS must be using different users to connect to AS. I tried different authentication methods on IIS, but without success.
How can I make IIS connect with proper permissions?
Dariusz
View 1 Replies
View Related
Dec 27, 2007
Hello,
I have the following scenario, and my main question is why the Excel-formatted file errors out while the TXT-formatted file does not. Any workaround would be helpful, but I do want to avoid dtexec and still want to manage it as SSIS.
MAIN ISSUES - permission issue & excel connection issue
SSIS Package Run Directly (not SQL Job, not automated)
* Every format and folder location works!
Package Run under SQL JOB
TXT Format
A. Project Folder - GOOD
B. Client Folder - FAILS!
Conclusion - TXT format works, but the Client folder does not have sufficient permission for PROD\svcSQLDMS
EXCEL Format
A. Project Folder - FAILS!
B. Client Folder - FAILS!
Conclusion - EXCEL Format DOES NOT work. Maybe SSIS Excel Connection Bug.
SQL Build 9.00.2153.00
-Lawrence
View 9 Replies
View Related
Jan 25, 2008
I've put together an SSIS package that, once a user uploads an Excel spreadsheet from a webpage, grabs it, does a mess of calculations and spits the results out into a DataReader (this last part is tricky, but I haven't even gotten to that point yet). In BIDS, the package works fine. Run using the 32-bit version of dtexec, it runs fine. But when I try to call it from the page, I keep getting an error. The errors look familiar enough that I'm thinking it's due to the package trying to run 64-bit, and that not playing nicely with Excel. If that's true, is there an easy way to force the 32-bit version to be used? I've already set the project properties to Run64BitRuntime = false, which I'd hoped would help, but no luck.
Here's the code I'm using:
Code Snippet
string pkg = @"pathpackage.dtsx";
Application app = new Application();
Package p = app.LoadPackage(p, null);
Variable varFileName = p.Variables["strExcelFileName"];
varFileName.Vaue = strFileName;
Variable varFilePath = p.Variables["strExcelFilePath"];
varFilePath.Value = strFilePath;
DTSExecResult dte = p.Execute();
Error handling
p.Dispose();
And it kicks out the following:
-1071636471
TestReports_UploadTestReport
Connection manager "Excel Connection Manager"
Microsoft.SqlServer.Dts.Runtime.DtsError0{8BDFE898-E9D8-4D23-9739-DA807BCDC2AC}
SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040154. An OLE DB record is available. Source: "Microsoft OLE DB Service Components" Hresult: 0x80040154 Description: "Class not registered".
-1071611876
DFT - Upload spreadsheet to dataset
Excel Source [1]
Microsoft.SqlServer.Dts.Runtime.DtsError0dtsmsg.rll{8BDFE893-E9D8-4D23-9739-DA807BCDC2AC}
SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "Excel Connection Manager" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed.
-1073450985
DFT - Upload spreadsheet to dataset
DTS.Pipeline
Microsoft.SqlServer.Dts.Runtime.DtsError0dtsmsg.rll{8BDFE893-E9D8-4D23-9739-DA807BCDC2AC}
component "Excel Source" (1) failed validation and returned error code 0xC020801C.
-1073450996
DFT - Upload spreadsheet to dataset
DTS.Pipeline
Microsoft.SqlServer.Dts.Runtime.DtsError0dtsmsg.rll{8BDFE893-E9D8-4D23-9739-DA807BCDC2AC}
One or more component failed validation.
-1073594105
DFT - Upload spreadsheet to dataset
View 3 Replies
View Related
May 29, 2008
Hello,
I have SQL Server running on 64-bit. When I schedule a package that has an Excel data transformation to run every night, it fails. Initially I had a problem running it through BIDS as well, but then I found that in the project properties you can switch this off by setting the Run64BitRuntime property to false. After going through some forums, many have suggested that when scheduling a package you modify the command arguments to run with the 32-bit runtime. These command arguments can be generated with the tool DTExecUI.exe. Even after using the command arguments generated by this tool, it doesn't help.
Below is my command argument and the error message generated.
/FILE "D:DevDWLoadAll.dtsx"
/CONFIGFILE "D:DevDWdevCongig.dtsConfig"
/CONNECTION "customers.dtsx";"D:DevDWcustomers.dtsx"
/CONNECTION "Addresses.dtsx";"D:DevDWAddresses.dtsx"
/CONNECTION "Counties.dtsx";"D:DevDWCounties.dtsx"
/CONNECTION "Countries.dtsx";"D:DevDWCountries.dtsx"
/CONNECTION "ExcelLoad.dtsx";"D:DevDWExcelLoad.dtsx"
/CONNECTION Test;""Data Source=TestDW;Initial Catalog=TestStagingDW;Provider=SQLNCLI.1;Integrated Security=SSPI;Auto Translate=False;"" /MAXCONCURRENT " -1 " /CHECKPOINTING OFF /REPORTING E
Message
Executed as user: DWUserSQLServiceAcc. ...0.3042.00 for 64-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved. Started: 00:05:00
Error: 2008-05-28 00:06:02.31 Code: 0xC0202009 Source: Excel Load Connection manager "Excel Connection Manager" Description: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040154. An OLE DB record is available. Source: "Microsoft OLE DB Service Components" Hresult: 0x80040154 Description: "Class not registered". End Error
Error: 2008-05-28 00:06:02.31 Code: 0xC020801C Source: Copy Data Excel Source [5052] Description: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "Excel Connection Manager" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed. End...
The package execution fa... The step failed.
Any suggestions???
Circuit
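One workaround often suggested for Excel connections on 64-bit servers (there is no 64-bit Jet provider) is to run the package from an Operating System (CmdExec) job step using the 32-bit DTExec instead of the SSIS job step; a sketch, assuming the default x86 tools install path:
Code Snippet
"C:\Program Files (x86)\Microsoft SQL Server\90\DTS\Binn\DTExec.exe" /FILE "D:\DevDW\LoadAll.dtsx" /CONFIGFILE "D:\DevDW\devCongig.dtsConfig"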
View 3 Replies
View Related
May 9, 2006
I have a SSIS package that reads the data from an Excel file using an Excel Connection Manager and imports the data to a table on a SQL Server 2005 DB instance.
When I run this package locally on the server, with the package on the file system, it executes perfectly. When I upload the package to the msdb database and run it from there, it still executes successfully.
But when I schedule the package to run as a SQL Server Agent job, it fails, and with logging enabled I see this in the log file:
OnError,WEB-INTSQL,NT AUTHORITY\SYSTEM,Copy to CRN-ALLOCATION_COMMENTS_TEMP,{40A6BF6E-7121-448B-A49D-DED58FDC746A},{BD991566-F4BD-41BC-AEBF-264032D8D0D3},5/9/2006 1:54:52 PM,5/9/2006 1:54:52 PM,-1071611876,0x,The AcquireConnection method call to the connection manager "Excel Connection Manager" failed with error code 0xC0202009.
OnError,WEB-INTSQL,NT AUTHORITY\SYSTEM,Copy to CRN-ALLOCATION_COMMENTS_TEMP,{40A6BF6E-7121-448B-A49D-DED58FDC746A},{BD991566-F4BD-41BC-AEBF-264032D8D0D3},5/9/2006 1:54:52 PM,5/9/2006 1:54:52 PM,-1073450985,0x,component "Allocation Comments" (1) failed validation and returned error code 0xC020801C.
I am wondering why the AcquireConnection method call fails when the package is scheduled. I am running the step as the SQL Agent Service Account, and it is the Local System account that starts up SQL Server Agent and SQL Server, and it is an Administrator on the box.
Any inputs will be much appreciated.
Thanks,
M.Shah
View 7 Replies
View Related
Feb 29, 2008
I've created a package to import XML files that are PGP encrypted, pulled from an FTP site. What tools can be used to decode the encryption through SSIS?
Is this possible? If not what can be used to automate this process?
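SSIS has no built-in PGP support, so one option is to shell out to a command-line tool such as GnuPG from an Execute Process Task or a script; a minimal VB.NET sketch, assuming GnuPG (e.g. gpg4win) is installed, the decryption key is already in its keyring, and the file paths are placeholders:
Code Snippet
Imports System.Diagnostics

Module DecryptPgpFile
    Sub Main()
        ' Placeholder file paths; the same command line also fits an Execute Process Task
        Dim psi As New ProcessStartInfo()
        psi.FileName = "gpg"
        psi.Arguments = "--batch --yes --output C:\incoming\data.xml --decrypt C:\incoming\data.xml.pgp"
        psi.UseShellExecute = False
        Using proc As Process = Process.Start(psi)
            proc.WaitForExit()
            If proc.ExitCode <> 0 Then
                Throw New ApplicationException("gpg failed with exit code " & proc.ExitCode.ToString())
            End If
        End Using
    End Sub
End Module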
View 6 Replies
View Related
Oct 19, 2010
I have an SSIS package which reads an Excel file and loads data into a table, using a script component (C#) as a source. The package runs without any errors when I run it manually on my machine and on the server, but it fails when run as a SQL Server Agent job.
I tried all the possible fixes I found on the web but still can't get it to work.
View 14 Replies
View Related
Nov 19, 2007
I am using an Excel Source to get the data from an Excel file into a SQL Server 2005 table. A couple of columns are coming in as double-precision float, but some values have characters in them, and those values come out as null even though I changed the data type from float to Unicode string. Any input on resolving this will be much appreciated.
Thanks,
Manisha
View 4 Replies
View Related
Jun 14, 2006
I've the following situation:
I have some Excel files controlled by a vendor which change frequently. The only thing that does not change is the header name of each column.
So my question is: is there any way in SSIS to create a new table based on the selected Excel file, including the column names? Then I can use the data reader as a source to select the columns I am interested in and start the integration.
Thanks.
Regards,
Yong Boon, Lim
P/S: The Excel header is at row 7.
View 3 Replies
View Related
Sep 18, 2007
Hello,
I have a problem with retrieving Excel data through the Excel Source component.
I have an Excel Source component which connects to my .xls sheet.
To retrieve the values from the sheet I am using this query:
"SELECT F14,F3 FROM [Charac Defn & Assgnment$]"
The column F14 is not formatted, so the format of the cells is "General". I have different types of values in the F14 column, such as "PE", "PES", 15, 20, 20.00, 8888.9999, etc.
When I click the preview button of the Excel Source it shows only the text values and not the int or decimal values; it returns NULL for those cells. I tried to use the Convert function, and it throws this error:
TITLE: Microsoft Visual Studio
------------------------------
There was an error displaying the preview.
------------------------------
ADDITIONAL INFORMATION:
Undefined function 'Convert' in expression. (Microsoft JET Database Engine)
Is there any other function to change the format of the cell, or do I need to do something else?
Please help me how to solve this issue.
View 6 Replies
View Related
May 7, 2015
I have a very small SSAS database, around 35 MB. I opened it in Excel 32-bit and started dragging fields into a pivot table, and it started failing with memory errors. The behavior on the SSAS server was that memory grew very fast up to 8 GB (the total VM memory), and then the error was reported in Excel.
What might be the issue in such a small database? I would understand in a big database, but not on this one.
View 13 Replies
View Related
May 12, 2004
Looking for suggestions on this one. What I want to do is have a text file, which may have any number of rows and columns (in a predefined format), that a user can update or insert into a table. The definition of the rows/columns, data mapping etc. has been done; it is the mechanics of actually doing the below that I would appreciate help and advice on.
As the user is an end user (and has no SQL knowledge at all), the text file to import from will be placed in a predefined location, and then a small script will be executed from their PC (as it happens, it's a Mac running an app that can execute an SQL command on the currently open database) that will in turn run a stored proc which then reads in (imports or updates) the appropriate tables with the contents of the external text file.
Sorry the explanation is a bit long winded but if anyone had any practical suggestions and examples, it would be greatly appreciated.
FYI, they are running SQL 2000 on both XP Pro and W2K3 server.
Thanks
Starb
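A sketch of what the server-side piece could look like: a BULK INSERT from the agreed predefined location, run here from VB.NET but equally suitable as the body of a stored procedure the client app would EXEC. The table name and path are placeholders, the path is read from the SQL Server machine's point of view, and the file is assumed to be tab-delimited with a header row:
Code Snippet
Imports System.Data.SqlClient

Module RunImport
    Sub Main()
        ' Placeholder server, table and file path
        Dim connStr As String = "Data Source=myServer;Initial Catalog=myDB;Integrated Security=SSPI"
        Dim sql As String = _
            "BULK INSERT dbo.ImportStaging " & _
            "FROM 'C:\imports\upload.txt' " & _
            "WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', FIRSTROW = 2)"
        Using cn As New SqlConnection(connStr)
            cn.Open()
            Using cmd As New SqlCommand(sql, cn)
                cmd.CommandTimeout = 0          ' large files can take a while
                cmd.ExecuteNonQuery()
            End Using
        End Using
    End Sub
End Module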
View 2 Replies
View Related
Jul 20, 2005
Hello specialists! Please help me - I need advice on importing textual data to SQL Server. I am using DTS with a simple process: Text (source) -> Connection. I want to increase the speed of the import because I have to import 4 GB of data (1000-character line length). Do you have any tips for me? Best regards, Manu
View 1 Replies
View Related
May 14, 2007
I am trying to add an Imports statement to the beginning of the custom code section of my report, but I receive this error: There is an error on line 0 of custom code: [BC30465] 'Imports' statements must precede any declarations.
My syntax looks like this:
Imports SB = System.Text.StringBuilder
Imports System.Enum
public function Test ...
end function
Does anyone know if this is even possible and why or why not?
In relation to that, can anyone answer why there is a 32K limit on the code section in Reporting Services?
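As far as I can tell, the custom code window is compiled as the body of a class, so file-level Imports statements cannot go there; fully qualifying the type names works instead. A small sketch:
Code Snippet
' In the report's Code window (which only accepts VB method and field declarations):
Public Function Test(ByVal items As Object()) As String
    Dim sb As New System.Text.StringBuilder()
    For Each item As Object In items
        sb.Append(System.Convert.ToString(item))
    Next
    Return sb.ToString()
End Function
The function is then called from an expression as =Code.Test(...). Referencing a custom assembly is the usual route when qualifying everything becomes too verbose.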
View 4 Replies
View Related
Aug 4, 2004
Hi guys,
I need to be able to send a text file with data seperated by tabs to a stored procedure that populates my table.
I am new to SQL and SPROCs that I am not sure how to even start. If you guys have any ideas I would really like to hear them or maybe any on-line docs and examples.
Thanks in advance!
View 2 Replies
View Related
Sep 5, 2007
I have been trying to follow/implement the examples in the following help topics (thanks to Jamie for these links).
Building Packages Programmatically
(http://msdn2.microsoft.com/en-us/library/ms345167.aspx)
Connecting Data Flow Components Programmatically
(http://msdn2.microsoft.com/en-us/library/ms136086.aspx)
The problem I am having is that MainPipe is not recognized as a valid type in my Script Task, even though I have the Imports statements that are listed in the example. I get the message "Error 30002: Type 'MainPipe' is not defined". The other, related problem is that when I type "Imports Microsoft.SqlServer.Dts", IntelliSense offers only two choices: {}Runtime and {}Tasks; I don't see any choice for Pipeline. Can anyone tell me what I am missing? It seems to be some kind of configuration/installation issue, but I have no idea how to resolve it. I have tried this on 3 different machines, with both the RTM SQL 2005 Standard Edition and with SP2 installed, all with the same result. Any help is appreciated.
Here is my code:
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports Microsoft.SqlServer.Dts.Pipeline
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Public Class ScriptMain
Public Sub Main()
'
Dim package As Microsoft.SqlServer.Dts.Runtime.Package = _
New Microsoft.SqlServer.Dts.Runtime.Package()
Dim e As Executable = package.Executables.Add("DTS.Pipeline.1")
Dim thMainPipe As Microsoft.SqlServer.Dts.Runtime.TaskHost = _
CType(e, Microsoft.SqlServer.Dts.Runtime.TaskHost)
Dim dataFlowTask As MainPipe = CType(thMainPipe.InnerObject, MainPipe)
Dts.TaskResult = Dts.Results.Success
End Sub
End Class
View 3 Replies
View Related
Jul 2, 2014
I've been reading about the "table lock on bulk load" option and TABLOCK hint.
So my understanding is by default only row locks are taken out and other queries can read/write data while the bulk load is going on. However if you were doing parallel bulk loads with overlapping keys from a clustered index then they may block each other.
But if the option is enabled, you can do the parallel bulk loads without blocking because a table lock is taken out, however, other processes couldn't read/write the data until they're all done.
Is that the gist of it? I think I got confused by some misinformation. Wouldn't all those row locks likely escalate to a table lock eventually anyway?
View 1 Replies
View Related
Nov 9, 2015
I have created a package that will insert new rows into destination1 if the AnID does not exist in Source1.
This uses a data flow task which contains an OLE DB source, a lookup and an OLE DB destination.
Source1 fields: AnID, Acol1, Acol2
Destination1 fields: AnID, Acol1, Acol2
I want to be able to update Destination1.Acol1 and Destination1.Acol2 if Source1.Acol1 or Source1.Acol2 have changed.
To do this, would I need to create a separate data flow task that includes a source, lookup and destination?
Or is it possible to build this into my insert-new-records data flow task?
View 3 Replies
View Related
Sep 23, 2007
I want to write a batch file that will do just that. The problem is that bcp or bulk copy never works with my code; it doesn't recognize it. Maybe I'm going about this the wrong way, but I could use some help. Ideally I'll make the batch file import a text file into a table via bcp or something like it, and use Windows scheduler to automate it. Be as specific as you can please, I'm very new to SQL Server.
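A hedged sketch of such a batch file, assuming a trusted connection, a tab-delimited file with one header row, and placeholder server, database, table and path names:
Code Snippet
rem import.bat - bulk load a tab-delimited text file into a table (all names are placeholders)
rem -c = character mode (defaults to tab field terminator / newline row terminator)
rem -T = trusted connection, -F 2 = skip the header row
bcp myDB.dbo.ImportStaging in "C:\imports\data.txt" -S myServer -T -c -F 2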
View 6 Replies
View Related
Sep 11, 2007
Hi,
I need to import an SQL string from MS Excel 2003 to SQL SERVER 2000.
The string I need to import is composed of 5 different blocks and looks like:
Code Snippet
CommandLine01 = "USE mydb"
CommandLine02 = "SELECT Block ..."
CommandLine03 = "GO
ALTER TABLE Block...
GO"
CommandLine04 = "UPDATE Block..."
CommandLine05 = "SELECT Block..."
The detail of the SQL string is at:
http://forums.microsoft.com/msdn/showpost.aspx?postid=2093921&siteid=1&sb=0&d=1&at=7&ft=11&tf=0&pageid=1
I am trying to implement OJ's suggestion:
http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=2117223&SiteID=1
to use multi-batch processing to import the string into SQL Server, something like:
Code Snippet
Dim SqlCnt, cmd1, cmd2, cmd3
'set the properties and open a connection
cmd1="use my_db"
cmd2="create table mytb"
cmd3="insert into mytb"
SqlCnt.execute cmd1
SqlCnt.Execute cmd2
SqlCnt.Execute cmd3
Below is the code (just partial) I have, and I need help to complete it.
Thanks in advance,
Aldo.
Code Snippet
Function TestConnection()
Dim ConnectionString As New ADODB.Connection
Dim RecordSet As New ADODB.RecordSet
ConnectionString = "Driver={SQL Server};Server=myServer;Database=myDBName;Uid=UserName;Pwd=Password"
ConnectionString.Open
CmdLine01 = " USE " & myDB
CmdLine02 = " SELECT ACCOUNTS.FULLNAME FROM ACCOUNTS" ...
CmdLine03 = "GO
ALTER TABLE Block...
GO"
CmdLine04 = "UPDATE Block..."
CmdLine05 = "SELECT Block..."
RecordSet.Open CmdLine01, ConnectionString
RecordSet.Open CmdLine02, ConnectionString
ConnectionString.Execute CmdLine01
ConnectionString.Execute CmdLine02
'Retrieve Field titles
For ColNr = 1 To RecordSet.Fields.Count
ActiveSheet.Cells(1, ColNr).Value = RecordSet.Fields(ColNr - 1).Name
Next
ActiveSheet.Cells(2, 1).CopyFromRecordset RecordSet
'Close ADO objects
RecordSet.Close
ConnectionString.Close
Set RecordSet = Nothing
Set ConnectionString = Nothing
End Function
View 7 Replies
View Related