Reliable, Fast Method Of Exporting Data To SQL
Jul 5, 2007
Hi,
For this scenario, what is the best method of exporting data to SQL Server 2005?
I want to export data from a desktop app across the internet to SQL Server. I can do this on a row-by-row basis, but it is very slow, and if the connection goes down halfway through I'm pretty much buggered.
What is the best, most reliable and fastest way to copy data across the internet (several thousand rows)? I have read about BULK INSERT etc., but how would I get around an upload that crashes halfway through? Is there a way of uploading so that the data is only inserted into the database once the whole upload goes through?
Would appreciate any guidance.
Richard
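One pattern worth sketching (the table and column names below are made up): bulk-load the rows into a staging table on the server first, then move the whole batch into the live table in a single transaction, so a connection drop mid-upload never leaves partial data in the real table.
Code Snippet
-- Hedged sketch (names are hypothetical): the client bulk-loads rows into
-- dbo.StagingOrders however it likes (SqlBulkCopy, bcp, BULK INSERT).
-- If the connection dies mid-upload, only the staging table is affected.
BEGIN TRY
    BEGIN TRANSACTION;

    -- Move the complete batch into the live table in one atomic step
    INSERT INTO dbo.Orders (OrderId, CustomerId, Amount)
    SELECT OrderId, CustomerId, Amount
    FROM dbo.StagingOrders;

    TRUNCATE TABLE dbo.StagingOrders;   -- clear the staging area for next time

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;   -- nothing partial reaches dbo.Orders
    RAISERROR('Staging-to-live move failed; the upload can be retried.', 16, 1);
END CATCH;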
View 3 Replies
Dec 29, 2007
I have a process that restores a backup from a primary server to a backup server daily. When doing the restore, it sometimes fails (for various reasons).
I have coded a job to set the database offline, set it online, and then do the restore:
RESTORE DATABASE [xxx] FROM DISK = N'D:\Backup Staging\xxx.bak' WITH FILE = 1, NOUNLOAD, REPLACE, STATS = 10
Sometimes it fails to bring the database back online; there are other errors as well. Is there a reliable method of doing this?
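For what it's worth, a common alternative to the offline/online step is to force the database into SINGLE_USER first so no other connection can block the restore; a rough sketch using the path from the post:
Code Snippet
-- Hedged sketch: kick out other connections with SINGLE_USER, then restore.
IF DB_ID(N'xxx') IS NOT NULL
    ALTER DATABASE [xxx] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;

RESTORE DATABASE [xxx]
FROM DISK = N'D:\Backup Staging\xxx.bak'
WITH FILE = 1, NOUNLOAD, REPLACE, STATS = 10;

ALTER DATABASE [xxx] SET MULTI_USER;   -- leave it usable again after the restore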
View 1 Replies
View Related
May 10, 2007
We have a static class that makes an HttpWebRequest to get XML data from one of our vendors. We use this as input to a stored proc in SQL Server 2005. When I compile this class and call it from a console application in Visual Studio it executes in milliseconds, every time. When I compile it, create the assembly and CLR function, and execute it in SQL Server, it takes around 14 seconds to execute the first time, then on subsequent requests it is again really fast, until I wait 10 seconds and re-execute; once again it is slow the first time and then fast on subsequent requests. We do not see this behavior when executing outside SQL Server. It makes me think that some sort of authentication is perhaps taking place the first time the function is run in SQL Server? I have no idea how to debug this further. Has anyone seen this before, or have any ideas?
Here is the class:
Code Snippet
using System;
using System.Collections.Generic;
using System.Text;
using System.Net;
using System.IO;
namespace Predict.Services
{
    public static class Foo
    {
        public static string GetIntradayQuote(string symbol)
        {
            string returnQuote = "";
            HttpWebRequest request = (HttpWebRequest)(WebRequest.Create("http://data.predict.com/predictws/detailed_quote.html?syms=" + symbol + "&fields=1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,28,30"));
            request.Timeout = 1000;
            HttpWebResponse response = (HttpWebResponse)(request.GetResponse());
            StreamReader streamReader = new StreamReader(response.GetResponseStream());
            returnQuote = streamReader.ReadToEnd();
            streamReader.Close();
            response.Close();
            return returnQuote;
        }
    }
}
When I call it from a console app it is fine.
I compile it into a dll and then create the assembly and function as follows:
Code Snippet
drop function fnTestGetIntradayQuoteXML_SJS
go
drop assembly TestGetIntradayQuoteXML_SJS
go
create ASSEMBLY TestGetIntradayQuoteXML_SJS from 'c:DataBackupsCLRLibrariesTestGetIntradayQuote_SJS.dll' WITH PERMISSION_SET = EXTERNAL_ACCESS
go
CREATE FUNCTION fnTestGetIntradayQuoteXML_SJS(@SymbolList nvarchar(max)) RETURNS nvarchar(max) AS EXTERNAL NAME TestGetIntradayQuoteXML_SJS.[Predict.Services.Foo].GetIntraDayQuote
go
declare @testing nvarchar(max)
set @testing = dbo.fnTestGetIntradayQuoteXML_SJS('goog')
print @testing
When I execute the function as above, again, it is really slow the first time, then fast on subsequent calls. Could there be something wrong with the code, or are there some headers that need to be set differently to operate from the CLR in SQL Server?
Regards,
Skipper.
View 1 Replies
View Related
Jan 23, 2004
Hello everybody
We need to move table T1 from database A to database B on the same server.
Table T1 is 15 GB and has 40,000,000 rows.
Database B was just created and will act as a warehouse.
Could it be done simply by:
1. creating table T1 in db B, and then
2. setting db B to simple recovery
3.
insert into B.dbo.T1
select * from A.dbo.T1
4. creating all the indexes on table T1 in db B
Free disk space is 35 GB.
Any idea how to optimize the import?
Thank you
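A rough sketch of one variation (it folds steps 1 and 3 together): under the simple or bulk-logged recovery model, SELECT INTO is minimally logged on SQL 2000, so it is usually cheaper than a plain INSERT ... SELECT for a one-off 15 GB copy, with the indexes built afterwards. The index name below is made up.
Code Snippet
-- Hedged sketch: SELECT INTO is minimally logged under simple/bulk-logged recovery.
ALTER DATABASE B SET RECOVERY SIMPLE;   -- step 2 from the post

SELECT *
INTO   B.dbo.T1            -- creates the table and copies the rows in one pass
FROM   A.dbo.T1;

-- step 4: build the indexes only after the data is in place (name is hypothetical)
-- CREATE CLUSTERED INDEX IX_T1_Key ON B.dbo.T1 (KeyColumn);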
View 5 Replies
View Related
Jun 10, 2015
Why is bcp out (exporting data from a SQL table to a text file using the bcp utility) faster?
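Broadly speaking, bcp streams rows through the bulk copy API with very little per-row overhead, and native mode skips character conversion entirely. For reference, a typical bcp out call looks something like this (server, database, and path are examples):
Code Snippet
rem -n = native format, -T = trusted connection, -S = server name
bcp AdventureWorks.dbo.SalesOrderDetail out C:\Export\SalesOrderDetail.dat -n -T -S MYSERVER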
View 6 Replies
View Related
Apr 8, 2006
I am searching for a way to fast-load relational data. I know how to load data fast, but how can I store the related data fast?
For example :
Table1 ( tabel1Id int identity , name varchar(255) )
Table2 ( tabel2Id int identity , tabel1Id int , name varchar(255) )
When I insert 50 records into Table1, I can't get the 50 identity values back to insert the related data into Table2.
I think one of the solutions could be returning a selection of Table1 joined with syslockinfo, but I have no idea how to do it.
Does anyone have an idea?
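If this is SQL Server 2005, the OUTPUT clause can hand back all 50 identity values in one statement. A sketch using the tables above, assuming Table2's second column is the foreign key back to Table1 and assuming a hypothetical #IncomingRows staging table that pairs each parent name with its child name:
Code Snippet
-- Hypothetical staging of the incoming batch
CREATE TABLE #IncomingRows (parentName varchar(255), childName varchar(255));

DECLARE @NewIds TABLE (tabel1Id int, name varchar(255));

-- Insert the 50 parents and capture every generated identity in one go
INSERT INTO Table1 (name)
OUTPUT inserted.tabel1Id, inserted.name INTO @NewIds (tabel1Id, name)
SELECT DISTINCT parentName FROM #IncomingRows;

-- Build the related rows using the captured identities (join back on the natural key)
INSERT INTO Table2 (tabel1Id, name)
SELECT n.tabel1Id, r.childName
FROM @NewIds AS n
JOIN #IncomingRows AS r ON r.parentName = n.name;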
View 3 Replies
View Related
Jul 10, 2007
Hi Every one,
How can I load or copy, say, millions of rows into a table in the database faster?
Thanks,
Mejo George
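One of the usual answers is a set-based bulk load instead of row-by-row inserts; a rough example (file path, delimiters, and table name are made up):
Code Snippet
BULK INSERT dbo.BigTable
FROM 'D:\Loads\bigtable.csv'
WITH (TABLOCK,                      -- can allow minimal logging on an empty heap
      FIELDTERMINATOR = ',',
      ROWTERMINATOR   = '\n',
      BATCHSIZE       = 100000);    -- commit in chunks to keep the log in check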
View 6 Replies
View Related
Jul 20, 2005
I'm currently working with a 10 million plus row database, with the data residing on a Unix box with Cache 5.0. The problem is that it can take five days to pull one table from Cache to SQL 2000 using the ODBC connection provided by Cache in a SQL 2000 DTS package. I think the real problem is converting the data from the post-relational format (Cache) to a relational format (SQL 2000)?
Does anyone have any ideas / suggestions on how to speed up this transfer of data? I'm very new to Cache and any help would be greatly appreciated.
Thanks,
-p
View 3 Replies
View Related
Mar 4, 2005
Does anyone know how to upload (bulk) data from a client (written in Excel VBA) to a remote SQL 2000 database? Of course I tried "INSERT INTO" and rst.AddNew, but I noticed this is much, much slower than downloading from the same remote database.
Thanks.
View 3 Replies
View Related
May 28, 2015
I have the DB structure below in MSSQL for a small application that follows a relational approach. Data retrieval (for Hostels) will need several joins; maybe a key-value approach would make data retrieval faster.
Hostels
------------
HostelId,
Name,
Address,
CategotyId,
SubCategoryId,
FoodCategoryId,
LandLordId
Data:
1 H1 Address1 1 1 2 20
2 H2 Address2 1 2 2 21
3 H3 Address3 2 2 1 17
Category
----------
CategoryId,
CategoryName
[code]...
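For comparison, the relational retrieval is only a handful of joins; a rough sketch against the listing above, assuming SubCategory, FoodCategory, and LandLord lookup tables shaped like Category (those table and name columns are guesses):
Code Snippet
SELECT h.HostelId, h.Name, h.Address,
       c.CategoryName, sc.SubCategoryName, fc.FoodCategoryName, l.LandLordName
FROM Hostels h
JOIN Category     c  ON c.CategoryId      = h.CategotyId      -- column name as listed above
JOIN SubCategory  sc ON sc.SubCategoryId  = h.SubCategoryId   -- assumed lookup table
JOIN FoodCategory fc ON fc.FoodCategoryId = h.FoodCategoryId  -- assumed lookup table
JOIN LandLord     l  ON l.LandLordId      = h.LandLordId;     -- assumed lookup table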
View 10 Replies
View Related
Jul 20, 2005
SQL 7.0
I have a form in ASP.NET and I want to write the values to the SQL Server tables. The tables are Customer and Address tables. There will be an insert into the Customer table, and I need to use the identity of this inserted record for the foreign key in the Address table insert.
For example:
INSERT INTO Customer (CustomerName)
VALUES (@CustomerName)
Select @@identity as CustomerID
INSERT INTO Address (Address, CustomerID)
VALUES (@Address, CustomerID)
My question is this. If I put this into a single stored procedure, can I absolutely GUARANTEE that the @@identity value will be from the Customer table insert, or could it feasibly be from another, as it were, colliding operation?
TIA
Edward
--
The reading group's reading group: http://www.bookgroup.org.uk
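A hedged sketch of the stored procedure (the procedure name and parameter sizes are made up): @@IDENTITY is scoped to the current connection, so another user's insert cannot "collide" with it, but an identity generated by a trigger on Customer could. On SQL 2000 and later SCOPE_IDENTITY() sidesteps that; on 7.0, capture the value immediately after the insert.
Code Snippet
CREATE PROCEDURE AddCustomerWithAddress
    @CustomerName varchar(100),
    @Address      varchar(200)
AS
BEGIN
    DECLARE @CustomerID int;

    INSERT INTO Customer (CustomerName) VALUES (@CustomerName);
    SET @CustomerID = @@IDENTITY;          -- capture before anything else runs
                                           -- (use SCOPE_IDENTITY() on SQL 2000+)
    INSERT INTO Address (Address, CustomerID) VALUES (@Address, @CustomerID);

    SELECT @CustomerID AS CustomerID;      -- hand the new key back to ASP.NET
END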
View 2 Replies
View Related
Jan 16, 2008
How reliable is SQL Compact 3.5?
What I need is to make sure that if my Windows CE based device loses power, my SQL Compact 3.5 database will not be corrupted and I can still read data from it.
Does anybody have any experience in this area?
This is a critical requirement for our project and I want to know whether it is feasible or not.
View 1 Replies
View Related
Jul 28, 2015
I'm looking for clarity on partition switching. The idea is to use many BULK INSERT statements into tables dbo.X_n in parallel and, when the BULK INSERT for table dbo.X_n is completed, switch dbo.X_n into dbo.bigdaddy. I think this is the fastest way to upload a couple hundred GB of data.
In learning about partition switching (in part) from The Data Loading Performance Guide under Partition SWITCH, I read the instructions as saying to copy the main table exactly to become a target. But in that same step (#1), I read that we need to change the default filegroup of the target (dbo.X_n) away from the default filegroup. Then it says I need to match indexes, and it lists the filegroup as something we need to match with the main table.
As an overview of the partition switching strategy, I think the whole point of BULK INSERT with partitioning is to have separate files (in the same group) to enable concurrent uploading, where each table has its own file. Once the upload to a table (dbo.X_n) is completed, we do the partition switch into the main table (dbo.bigdaddy). The data we just uploaded doesn't actually move, just the metadata for it.
When I read the instructions linked above, I hear “Don’t have the same filegroup on your target as the main table. You must have the same filegroup on your target as the main table.”
Where am I disconnected?
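For what it's worth, the two statements stop contradicting each other if the requirement is read as: the staging table must sit on the same filegroup as the specific partition it will occupy, which may or may not be the database's default filegroup. A bare-bones sketch of the switch itself (names follow the post; the partition number is made up):
Code Snippet
-- The staging table must live on the same filegroup as the partition it will
-- occupy, with matching columns and indexes, and a check constraint that pins
-- its rows to that partition's boundary range.
ALTER TABLE dbo.X_1
SWITCH TO dbo.bigdaddy PARTITION 1;   -- metadata-only: no rows physically move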
View 5 Replies
View Related
Mar 5, 2007
Hi..
I have a SQL Server 2005 database. I am developing a C# application for customers that sends some specific information to my central database.
If I create a connection over TCP/IP (in order to send data to the central database) like
sqlconnection1.ConnectionString = "Data Source=192.XXX.Y.ZZ,1433;Initial Catalog=Muster;Integrated Security=True";
it would not be reliable. How can we make that safe?
Thanks...
View 4 Replies
View Related
Nov 22, 2002
I have set up two Maintenance Plans to do daily backups overnight for our two DBs that are on our SQL server.
As a separate strategy we want to have a second backup done nightly that would involve
#1 detach the db(s)
#2 copy the mdf(s) and ldf(s) to L:\xxx
#3 attach the db(s) again
#4 zip up the db(s)
#5 copy the files to a different server for storage.
I created #1 and #3 in Query Analyzer and saved the scripts.
I want the process to run at, say, 2 AM and don't know how to schedule the steps to run in order - that is #1 ... #2 ... #3. Does anyone have sample scripts for this kind of a backup strategy?
This seems like a very simple process, especially for a restore, and especially since there would not be any transaction logs involved. This way, if we had a KRASH, we could take the backup from the separate server and install it on a warm SQL backup server.
Apart from the new server's name, what other steps would I need to cover to get the apps up and running in the quickest time? The app software runs as a client install - Access 97 DB. I'm especially curious whether I need more than a fresh SQL Server 2000 install.
I know I'm asking a lot for a first timer.
It seems we want to handle disaster recovery BEFORE we even have our first system crash.
many thanks
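A rough sketch of steps #1-#3 as steps of a single SQL Server Agent job scheduled for 2 AM (the database name and paths are placeholders; SQL 2000-era syntax):
Code Snippet
-- Step 1: detach (placeholder database name)
EXEC sp_detach_db 'MyDb';

-- Step 2: copy the files (could also be an operating-system job step)
EXEC master..xp_cmdshell 'copy D:\Data\MyDb.mdf L:\xxx\ && copy D:\Data\MyDb_log.ldf L:\xxx\';

-- Step 3: attach again
EXEC sp_attach_db 'MyDb',
     'D:\Data\MyDb.mdf',
     'D:\Data\MyDb_log.ldf';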
View 11 Replies
View Related
May 5, 2004
Maybe I am just a lot better at this than I thought, but I figure that somewhere there is a mathematical rule that is being overlooked. When I run dbcc sqlperf (lrustats) on some of my production machines, I sometimes end up with a cache hit ratio (which is defined as a percentage, mind you) that is slightly over the limit:
Statistic Value
-------------------------------- ------------------------
Cache Hit Ratio 100.00898
Cache Flushes 0.0
Free Page Scan (Avg) 0.0
Free Page Scan (Max) 0.0
Min Free Buffers 331.0
Cache Size 4362.0
Free Buffers 9434.0
I suspect some counter somewhere is getting wrapped around its 4-byte limit. Is there any reliable source for getting statistics about SQL Server performance? Users tend to be unreliable and say everything is slow.
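As one possible cross-check (a sketch, not a guarantee), the same ratio can be computed from the raw counters in master.dbo.sysperfinfo, which sidesteps whatever rounding or counter wrap dbcc sqlperf(lrustats) is doing:
Code Snippet
SELECT 100.0 * r.cntr_value / NULLIF(b.cntr_value, 0) AS BufferCacheHitRatio
FROM master.dbo.sysperfinfo AS r
JOIN master.dbo.sysperfinfo AS b
  ON b.object_name = r.object_name
WHERE r.counter_name = 'Buffer cache hit ratio'
  AND b.counter_name = 'Buffer cache hit ratio base';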
View 2 Replies
View Related
May 10, 2007
Hi,
I am trying to write a method which needs to call a stored procedure and then get the response of the stored procedure back into a variable I declared in the method.
private string GetFromCode(string strWebVersionFromCode, string strWebVersionString)
{
    //call stored procedure
}
strWebVersionFromCode = GetFromCode(strFromCode, "web_version"); // the variable which will store the response
How should I do this? Please assist.
View 3 Replies
View Related
Mar 14, 2008
I have been trying to get this one line figured out for a few days now.
'Job2 Info
Dim selectSQL2 As String
selectSQL2 = "SELECT * FROM '" & CompanyKey & "'" '<<-------HERE
Dim cmd2 As New SqlCommand(selectSQL2, con)
'Job2 Select
Try
con.Open()
reader = cmd2.ExecuteReader()
(I have the full code below.) So here is the problem: this code is not populating the datagrid. There is data in the table I am selecting from. When I log in, my CompanyKey value displays in the label as "21". When I take out the "CompanyKey" variable and just type in 21, the grid is populated. It is confusing the heck out of me. I have tried it this way:
selectSQL2 = "SELECT * FROM [" & CompanyKey & "]"
-and this way:
selectSQL2 = "SELECT * FROM & CompanyKey
and all the other ways I could think of. I researched it and can just not get it to work any way I do it. Any suggestions? Full code below:
____________________________________
Imports System.Data
Imports System.Data.Common
Imports System.Data.SqlClient

Partial Class _Default
    Inherits System.Web.UI.Page

    Protected Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load

        'Database Connection
        Dim con As New SqlConnection("Data Source=.\SQLExpress;integrated security=true;attachdbfilename=|DataDirectory|ASPNETDB.mdf;user instance=true")

        'Job1 Info
        Dim currentUserID
        currentUserID = Context.User.Identity.Name.ToString()
        Label1.Text = currentUserID
        Dim selectSQL1 As String
        selectSQL1 = "SELECT companyKey FROM Company WHERE UserID = ('" + currentUserID + "')"
        Dim cmd1 As New SqlCommand(selectSQL1, con)
        Dim reader As SqlDataReader
        Dim CompanyKey

        'Job1 Select
        Try
            con.Open()
            reader = cmd1.ExecuteReader()
            Do While reader.Read()
                CompanyKey = reader("CompanyKey").ToString()
            Loop
            reader.Close()
        Catch err As Exception
            ReaderError.Text = "Error selecting record."
            ReaderError.Text &= err.Message
        Finally
            con.Close()
        End Try

        'Job2 Info
        Dim selectSQL2 As String
        selectSQL2 = "SELECT * FROM [" & CompanyKey & "]"
        Dim cmd2 As New SqlCommand(selectSQL2, con)

        'Job2 Select
        Try
            con.Open()
            reader = cmd2.ExecuteReader()
            GridView1.DataSource = reader
            GridView1.DataBind()
            reader.Close()
        Catch err As Exception
            ReaderError.Text = "Error selecting record."
            ReaderError.Text &= err.Message
        Finally
            ReaderResults.Text = CompanyKey
            con.Close()
        End Try
    End Sub
End Class
View 4 Replies
View Related
Feb 24, 2007
I am using the following query to export data from sql server to ms access in export data wizard:
SELECT * FROM myView where myID = 123
Order by varcharColumnName1,varcharColumnName2 ,intColumnName3
This query will fetch about 700,000 records.
SQL Server 2005 shows the correct order, but the data in the Access table comes out in the wrong order.
Please give me a solution.
View 4 Replies
View Related
Jun 28, 2012
I have a sql server 2008 backend with an Access 2007 frontend database. Each time I export a query I get the following error:
Code:
Microsoft Access was unable to append all the data to the table.
The contents of fields in 0 record(s) were deleted, and 1 record(s) were lost due to key violations.
*If data was deleted, the data you pasted or imported doesn't match the field data types or the FieldSize property in the destination table.
*If records were lost, either the records you pasted contain primary key values that already exist in the destination table, or they violate referential integrity rules for a relationship defined between tables. Do you want to proceed anyway?
I don't know what, if anything, is actually missing because the amount of data is more than 6,000 records. It seems everything exported, but I would have to comb through the data to be sure.
View 3 Replies
View Related
Mar 1, 2006
Hi,
Does anyone know if it is possible to point data that underwent the "merge join" transformation (in one data flow) to the following data flow? I don't want to recreate all that merging, sorting and calling the same sources again in the following data flow if the data that I am using exists in the previous data flow. The merged data is simply too big to export to an excel file, so does anyone have any ideas? Thanks!
View 8 Replies
View Related
Oct 24, 2007
I posted a more detailed technical question a couple of weeks ago
with Subject = "Compact Edition 2005 Subscribers - Merge Agent failed after detecting that retention-based cleanup has deleted metadata"
But I got no replies.
Now my question is simpler: Is SQL Server 2005 merge replication reliable with Compact Edition subscribers on Windows XP tablets?
We deleted all users' local databases a couple of weeks ago, and republished two publications with retention = 60 days.
Now we have at least one user who can't sync with the server, and replication monitor shows the error below.
But if she e-mails her sdf file to a support person, they can successfully sync her database.
We're sure that it's not a permissions issue, which is a normal suspicion when one user can perform a task that another can not.
It doesn't seem like a retention problem, because she's well inside the retention period.
She can successfully sync with the publication that has upload/download tables, but not with the publication that has download-only tables. Both publications were created on the same day, by the same DBA, with nearly identical properties.
This problem user has operated in the past with no problem, and she's configured identically to all of the others. It just seems like random flakiness with merge replication for Compact Edition subscribers.
I feel stupid saying that, because it sounds like I believe in a magical ghost in the machine. So my question is - IS MERGE REPLICATION A LITTLE BIT FLAKY FOR COMPACT EDITION SUBSCRIBERS ON XP MACHINES?
We've applied SP2 to the server, but no subsequent HotFixes.
We're trying to sell users and management on developing a larger occasionally connected system with the same replication scheme, so we need to know if it deserves our confidence.
Error messages:
The Merge Agent failed after detecting that retention-based metadata cleanup has deleted metadata at the Publisher for changes not yet sent to the Subscriber. You must reinitialize the subscription (without upload). (Source: MSSQL_REPL, Error number: MSSQL_REPL-2147199402)
Get help: http://help/MSSQL_REPL-2147199402
View 2 Replies
View Related
May 4, 2007
I have built a WM5 SQLce 2005 application using primarily table adapters, which sparked a discussion about whether it is more efficient to use table adapters or to use SQLce command execution with coding for a device.
Are there any papers / threads or thoughts as to the best access method to use?
View 1 Replies
View Related
Jan 29, 2008
Hi,
I just have a DataSet with my tables and that's it.
I have a GridView with several pieces of data on it.
There is no problem getting the data or inserting, but as soon as I try to delete or update some records, the local machine throws the same error:
Unable to find nongeneric method...
I've tried to create an Update query in my table adapters but it is still not working with this one.
I also tried to remove the original_{0} and got the same error...
Please help if anyone has a solution
Thanks
View 7 Replies
View Related
Jul 31, 2006
Can you describe the best (or your preferred) method for updating data held in a related table using Visual Studio 2005 and SQL Server.
For example; if you had a stock control system with the product names and current stock levels in one table and all stock movements in and out held in another table, what is the best, fastest, safest and most reliable method of inserting a stock movement and then updating the current stock level?
I have tried a couple of different methods but would really appreciate a wider range of opinions.
Thanks
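One common pattern, assuming SQL Server 2005 and made-up table and column names, is to wrap both statements in a single stored procedure and one transaction, so the movement row and the stock level can never get out of step:
Code Snippet
CREATE PROCEDURE dbo.AddStockMovement
    @ProductId int,
    @Quantity  int            -- positive for goods in, negative for goods out
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRY
        BEGIN TRANSACTION;

        INSERT INTO dbo.StockMovement (ProductId, Quantity, MovementDate)
        VALUES (@ProductId, @Quantity, GETDATE());

        UPDATE dbo.Product
        SET    CurrentStockLevel = CurrentStockLevel + @Quantity
        WHERE  ProductId = @ProductId;

        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;   -- both changes succeed or neither does
        RAISERROR('Stock movement failed.', 16, 1);
    END CATCH;
END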
View 2 Replies
View Related
Mar 2, 2004
I have a large SQL database I'm building. In the database there is a table for each user with certain things listed. One problem is that sometimes, in a few columns, there will be like 100 things listed instead of one line.
For Example:
Products: some users will have 1 product, like "apples", but others will have many more products.
What is the most effective way of listing the data? Create new tables? Separate the products with commas or spaces?
How do I do it, and keep the overall db size smaller?
Thanks
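The usual relational answer is a single shared child table rather than comma-separated values or one table per user; a sketch with hypothetical names:
Code Snippet
CREATE TABLE dbo.UserProduct (
    UserId      int          NOT NULL,   -- key of the owning user
    ProductName varchar(100) NOT NULL,
    CONSTRAINT PK_UserProduct PRIMARY KEY (UserId, ProductName)
);

-- One row per product per user: "apples" is one row,
-- and a user with 100 products simply gets 100 narrow rows.
INSERT INTO dbo.UserProduct (UserId, ProductName) VALUES (1, 'apples');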
View 5 Replies
View Related
Mar 25, 1999
Below is code that I have acquired from both the SQL 6.5 Books Online and a VB5 book. The problem is that when I run this code it returns the error "Method or data member not found." I have been to Project > References and made sure that both the Microsoft RemoteData Object 2.0 and RemoteData Control 2.0 were included in my project.
The RemoteData control works great; however, it does not allow me to add data and remove data.
Am I missing a Reference file?
Do I have the correct version of Remotedata objects?
Private Sub Command1_Click()
Dim cn As rdoConnection
Set cn = rdoEngine.rdoEnvironments(0).OpenConnection("sqlserver")
Dim mysql As String
mysql = "select cuscode from customer where cuscode = '1122'"
Dim myqy As rdoQuery
Set myqy = cn.CreateQuery("myqy1", "")
myqy.SQL = mysql
Dim myrs As rdoResultset
Set myrs = myqy.OpenResultset(rdOpenForwardOnly, _
rdConcurReadOnly)
While Not myrs.EOF
Debug.Print myrs(0)
myrs.MoveNext
Wend
myrs.Close
myqy.Close
End Sub
Thanks in advance.
LoPingKill
loping@inlink.com
View 2 Replies
View Related
Feb 16, 2004
Hi
What's the best way to display data on a company intranet or the web,
where users can also make small choices (say, between such-and-such dates)?
Something like a report, with charts etc.,
using a graphical, easy-to-learn language.
I have strong knowledge of SQL and ok with HTML
View 3 Replies
View Related
Feb 2, 2007
Hello all,
I'm at a bit of a loss as it seems what I want to do should be obvious but I can't seem to locate whether it is possible. Essentially I would like to render a report using the api/C# and I would like to set the actual data the report renders via an XML string that I construct however I like within my code. Is this not possible? I hope I'm overlooking the obvious here.
Thanks in advance for your feedback.
View 4 Replies
View Related
Dec 27, 2007
Hi, I'm new to Reporting Services, and rather new to heavy processing in Reporting Services. I have a scenario
for which I need your help. So here it goes:
I have a method in the Code area of the report, and I'm passing parameter values to it. The method will return a SQL query in string format. I need to execute it in the Data tab area. The code I have used in the Data tab is below. Please let me know what to do to make it right. Thx
EXEC ('Code.Main(Parameters!Param1)' +
UNION
+ 'Code.Main(Parameters!Param2) ' +
UNION ALL
+ 'Code.Main(Parameters!Param3)')
Thx again
I need to get this report done really soon, so please, if you have any idea let me know ASAP
View 6 Replies
View Related
Mar 21, 2001
hi all,
I am having a problem importing data from an Excel file.
I am able to do the same with flat files, but when I do it with Excel files
it throws an error: format error.
Please help me in this regard.
Also, how do I export data to files from Query Analyzer?
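For the last part, one way to export a query's results to a file from the command line is bcp in queryout mode (the server, database, and path below are made up):
Code Snippet
rem -c = character format, -T = trusted connection, -S = server name
bcp "SELECT col1, col2 FROM MyDb.dbo.MyTable WHERE col1 > 10" queryout C:\Export\results.txt -c -T -S MYSERVER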
View 2 Replies
View Related
Feb 1, 2005
Hello everybody,
I was wondering if there is a way to export and recode data at the same time with SQL.
For example, I have gender information coded as 1 or 2 in my table, and I need to upload the information to a different application that needs M or F. Is there a way to export to a new table and recode at the same time?
I'm still pretty new to it.
Thank you.
-Seb
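A CASE expression in the export query is usually enough; a sketch with guessed table and column names:
Code Snippet
SELECT  PersonId,
        CASE Gender WHEN 1 THEN 'M'
                    WHEN 2 THEN 'F'
        END AS Gender
INTO    dbo.PersonExport          -- or feed this SELECT straight to the export tool
FROM    dbo.Person;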
View 5 Replies
View Related