I'm trying to take data and push it into a new database.
This data was exported from a database that had many constraints, foreign keys, etc., quite a complicated structure, and it has been sent to us as just the data in the tables with none of those constraints or FKs!
This data now needs pushing back into the original database structure so that all of it lives within the constraints, but using an insert SPROC throws some FK errors. What is the best way to push this data into its structure?
Somebody suggested DTS, but that doesn't take the constraints into account, and we also need to be able to run this at any time for a number of different datasets. Can anybody help? (If you can decipher my explanation.)
Many thanks for reading, please indicate if you need more info.
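One approach that may help, sketched only (the table and constraint names here are hypothetical): load parent tables before child tables where possible, and where the load order can't be fixed, temporarily disable the FK checks, load the data, then re-enable them WITH CHECK so SQL Server validates everything that was inserted:

-- Disable FK checking on the child table (hypothetical names)
ALTER TABLE dbo.OrderDetail NOCHECK CONSTRAINT FK_OrderDetail_Order

-- Load the delivered data
INSERT INTO dbo.OrderDetail (OrderID, ProductID, Qty)
SELECT OrderID, ProductID, Qty FROM staging.OrderDetail

-- Re-enable with validation; this step fails if any orphaned rows slipped in
ALTER TABLE dbo.OrderDetail WITH CHECK CHECK CONSTRAINT FK_OrderDetail_Order

Because the re-enable step validates the data, nothing ends up silently violating the constraints.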
I'm currently copying data tables to other tables, some of which don't allow duplication of data; I'm dealing with about 40 tables. The insert errors when I try the full copy process. Is there a way of saying: copy the data if it's not there, and if there is matching data, ignore it? I have seen there is IF EXISTS(); would this work in this case?
The only thing I can think of is to delete all the data from the table, which in some cases will lead to a loss of data. Bad times :)
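If I've understood the requirement, a pattern like this (table and key names are hypothetical) copies only the rows that are not already present and silently skips the matches:

INSERT INTO dbo.TargetTable (ID, Col1, Col2)
SELECT s.ID, s.Col1, s.Col2
FROM dbo.SourceTable s
WHERE NOT EXISTS (SELECT 1 FROM dbo.TargetTable t WHERE t.ID = s.ID)

Repeated per table (about 40 times here), this avoids both the duplicate-key errors and the need to delete anything.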
I'm using SQL Server Express 2005 at the client's site. I want to copy the database with all its data and put it on my SQL Server Express 2005 in the office. Please help: how can I copy the entire database from one SQL Server Express instance and put it on another?
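The usual route is backup and restore; a minimal sketch (the database name and paths are examples):

-- At the client site
BACKUP DATABASE ClientDB TO DISK = 'C:\Backups\ClientDB.bak'

-- Copy the .bak file to the office machine, then:
RESTORE DATABASE ClientDB FROM DISK = 'C:\Backups\ClientDB.bak'
WITH MOVE 'ClientDB' TO 'C:\Data\ClientDB.mdf',
     MOVE 'ClientDB_log' TO 'C:\Data\ClientDB_log.ldf'

If the logical file names are unknown, RESTORE FILELISTONLY FROM DISK = '...' lists them.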
I have just been handed a HUGE DB with hundreds of SPs, tables, UDFs, etc., and I need to add thousands of records from a flat file across dozens of related tables ASAP! (You know the drill.) The DB was built with a lot of business logic, constraints, etc.
I am a web app developer; I know a lot of T-SQL, can build basic triggers and some pretty complex SPs, and have built solid DBs from scratch, but I usually handle business logic and data integrity before it even gets to the DB (or at least I like to think I do).
With that said, there is no documentation (of course) and I need to reverse-engineer this beast ASAP. I was hoping there is a way to query the sys tables, or some tricks, to get all the info I am looking for. I use the ALT+F1 shortcut on tables all the time and am looking for a solution along those lines.
Any help would be great, and whoever gives me the key(s) to unlocking this mess gets free access to a new RSS fantasy football site launching this summer! If you're not into FF then you can give it to a friend or something.
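As a starting point, and assuming nothing beyond the standard catalog views: the INFORMATION_SCHEMA views work on both SQL 2000 and 2005, and sp_help is essentially what ALT+F1 runs:

-- List every FK with its child and parent table
SELECT rc.CONSTRAINT_NAME,
       fk.TABLE_NAME AS child_table,
       pk.TABLE_NAME AS parent_table
FROM INFORMATION_SCHEMA.REFERENTIAL_CONSTRAINTS rc
JOIN INFORMATION_SCHEMA.TABLE_CONSTRAINTS fk ON fk.CONSTRAINT_NAME = rc.CONSTRAINT_NAME
JOIN INFORMATION_SCHEMA.TABLE_CONSTRAINTS pk ON pk.CONSTRAINT_NAME = rc.UNIQUE_CONSTRAINT_NAME

-- The ALT+F1 equivalent for any one object ('dbo.SomeTable' is a placeholder)
EXEC sp_help 'dbo.SomeTable'

The FK listing also gives the dependency order needed when loading the flat-file records across the related tables.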
I would like to copy my MSDE database so I can deploy it together with my web page on another PC. Do I simply copy the .mdf file, or is there another way to extract it?
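Copying the .mdf alone only works if the database has been cleanly detached first; otherwise the file is in use and may be out of sync with its log. A sketch (the name and paths are examples):

-- On the source machine
EXEC sp_detach_db 'MyWebDB'

-- Copy MyWebDB.mdf and MyWebDB_log.ldf to the target PC, then there:
EXEC sp_attach_db 'MyWebDB', 'C:\Data\MyWebDB.mdf', 'C:\Data\MyWebDB_log.ldf'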
Hi, my question is: we want to make a copy of a database onto the same server while preserving the diagrams, stored procs, etc. We stopped the SQL service and made a copy of the data and log files. We attempted to ATTACH the file copies (after renaming them), but the embedded file information tells SQL that the database already exists.
Our database is a piece of junk, but we must use it. If we don't want to use the Import/Export Wizard to copy objects (because we get some errors during the transfer), how can we make an exact copy of the database onto the same server while giving the "new" database a different name?
Thanks, DHRUV
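One way that sidesteps the attach collision entirely (names and paths are examples): restore a backup of the database under a new name on the same server, moving the physical files as you go. Since it is a byte-for-byte copy, it keeps the diagrams, stored procs, and everything else:

BACKUP DATABASE JunkDB TO DISK = 'C:\Backups\JunkDB.bak'

RESTORE DATABASE JunkDB_Copy FROM DISK = 'C:\Backups\JunkDB.bak'
WITH MOVE 'JunkDB'     TO 'C:\Data\JunkDB_Copy.mdf',
     MOVE 'JunkDB_log' TO 'C:\Data\JunkDB_Copy_log.ldf'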
I have an ASP.NET and VB.NET application that needs to copy selected stored procedures from one SQL Server to another SQL Server. The scenario is that the user selects a source server from a drop-down list, then the source database from another drop-down list, which then populates a listbox with the stored procedures in that database. They can select multiple stored procedures from the listbox.
Once selected, they choose a target SQL Server from a drop-down list and then a target database from another drop-down list. I have a command button for the actual move, but I am not sure how to copy the selected source stored procedures to the target database. Any ideas would be greatly appreciated.
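For the copy step itself, one hedged sketch: pull each selected procedure's source text on the source connection and execute that text on the target connection. On SQL 2005, OBJECT_DEFINITION does it in one call; on SQL 2000, syscomments holds the text (the procedure name is a placeholder):

-- SQL 2005 and later
SELECT OBJECT_DEFINITION(OBJECT_ID('dbo.MyProc'))

-- SQL 2000
SELECT c.text
FROM syscomments c
JOIN sysobjects o ON o.id = c.id
WHERE o.name = 'MyProc' AND o.type = 'P'
ORDER BY c.colid

The VB.NET side then just runs the returned CREATE PROCEDURE text as a command against the target database (long definitions come back from syscomments in multiple rows, so concatenate them in colid order).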
Is there any way I can copy a primary key (PK) value to another field in the same table? Say my PK is MemberID; I want to replicate that into another field, say SortID, at the same time the primary key is incrementing. I'm a newbie in this field, so please pardon me if something is wrong in the way I'm asking. Please help me, friends; I've been struggling for a long time. Thanks in advance, savvy
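A minimal sketch, assuming MemberID is an IDENTITY column and the table is called dbo.Members (both assumptions): an AFTER INSERT trigger can copy the newly generated key into SortID as each row arrives:

CREATE TRIGGER trg_Members_CopySortID ON dbo.Members
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON
    -- Copy the just-assigned PK into SortID for the inserted rows only
    UPDATE m
    SET m.SortID = m.MemberID
    FROM dbo.Members m
    JOIN inserted i ON i.MemberID = m.MemberID
END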
First, I consider myself a novice in MSSQL. I've recently begun the revision of a website that uses MS SQL Server 2000. In an effort to set up a working environment on my PC, I asked for a copy of the SQL Server database. The response was something like "I'm not going to do that!" or "that's too much effort", which left me baffled. Furthermore, the size of the database shouldn't be the issue.
In Access, all that is needed to copy a database is the .mdb file. Is it that much more complicated in SQL Server? My questions are as follows:
How does someone copy a SQL Server database to another PC? Were they expecting me to test remotely against SQL Server? Is that possible or practical? Isn't it preferred to have the DB local?
When running a test against SQL Server 2005 Express, most of the connections work fine, but some don't. I get this exception:
When connecting to SQL Server 2005, this failure may be caused by the fact that under the default settings SQL Server does not allow remote connections. (provider: Named Pipes Provider, error: 40 - Could not open a connection to SQL Server)
It seems to me I am reaching some kind of limitation. What can I do to increase the number of connections?
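Before assuming a hard cap, it may be worth checking how many connections are actually open and what the instance allows (a diagnostic sketch; Express has no special connection limit, so an exhausted connection pool or a disabled network protocol is the more common culprit behind error 40):

-- How many connections exist right now
SELECT COUNT(*) AS open_connections FROM sys.dm_exec_connections

-- The configured maximum (0 means dynamic, up to 32767)
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'user connections'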
Hi, our app uses an NT4 SQL 2000 box and a Win2k IIS box. To fulfil the text-searching requirements we've had to use dynamic SQL (created within a large SP). This query searches about 8000 rows in a flat table (which we create and keep up to date using a scheduled SP that trawls the underlying tables). It's a complicated SP and can return all 8000 rows (I think we're winning the battle to have this capped, though).
When we get about 50 users hitting the app and doing searches simultaneously, performance drops severely (20+ seconds for a query rather than ~2 seconds), and we get a lot of errors which report: Source: .Net SqlClient Data Provider; Message: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
We also get a lot of Invalid Viewstate errors, which I'm totally baffled by, and the odd IIS Access Denied error, as if authentication (NTLM) is timing out. Neither the web nor the SQL machine is being hit very hard; the web server in particular is not under any strain whatsoever. It doesn't appear to be related to connection pooling, judging from the logged error messages. Does anyone have ANY suggestions as to where I should be looking?!
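One thing that may be worth ruling out, sketched with hypothetical names: if the dynamic SQL embeds the search terms in the string and runs via EXEC(), every distinct search compiles a fresh plan, which hurts badly at 50 concurrent users. Parameterising through sp_executesql lets plans be reused:

DECLARE @sql nvarchar(4000)
SET @sql = N'SELECT * FROM dbo.SearchFlat WHERE Title LIKE @term'
EXEC sp_executesql @sql, N'@term nvarchar(100)', @term = N'%widget%'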
I am doing a heavy bulk insert task into one database. When I checked the Activity Monitor, I saw it was suspended with wait type PAGEIOLATCH_EX, so it appears to have stalled. Do any experts have a good idea of what is going on?
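For what it's worth, PAGEIOLATCH_EX normally means the session is waiting for the disk subsystem to finish moving a data page, not that SQL Server has stopped the task; "suspended" is simply the waiting state. A quick look at what it is waiting on (SQL 2005 DMVs):

SELECT session_id, status, wait_type, wait_time, last_wait_type
FROM sys.dm_exec_requests
WHERE session_id > 50  -- skip system sessions

If the wait time keeps climbing on PAGEIOLATCH_EX, the bottleneck is I/O throughput rather than the insert itself.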
What would be the best procedure for the following situation?
It is a heavy-traffic database in daily use, with GBs of growth every day, so full backups every night are needed. The vendor recommends not taking log backups but just copying the log files to another location. Will this help avoid degrading performance during business hours?
If I don't take log backups, I am not able to recover to a point in time if needed; also, the log files can then grow faster, and I will have to shrink them more often.
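For comparison, an actual log backup is a one-liner (the database name and path are examples), and unlike merely copying the .ldf file, it keeps the point-in-time recovery chain intact and lets the log space be reused instead of growing:

BACKUP LOG HeavyDB TO DISK = 'E:\LogBackups\HeavyDB_log.trn'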
I have a specific job that should run with decreased memory usage when the workload on the SQL Server is heavy. It is a heavy job with no specific requirement on response time, but it is important that the rest of the application isn't affected by longer response times while this job is running. How can this job be handled from the application without having to create a separate batch job?
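If the instance is SQL Server 2008 or later (an assumption; no version is stated), Resource Governor can cap the job's memory without any application changes; a sketch with hypothetical names, keyed off the connection's application name:

USE master
GO
CREATE RESOURCE POOL HeavyJobPool WITH (MAX_MEMORY_PERCENT = 20)
CREATE WORKLOAD GROUP HeavyJobGroup USING HeavyJobPool
GO
-- Route sessions whose connection string sets Application Name=HeavyJob
CREATE FUNCTION dbo.fn_rg_classifier() RETURNS sysname
WITH SCHEMABINDING
AS
BEGIN
    RETURN CASE WHEN APP_NAME() = 'HeavyJob'
                THEN N'HeavyJobGroup' ELSE N'default' END
END
GO
ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION = dbo.fn_rg_classifier)
ALTER RESOURCE GOVERNOR RECONFIGURE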
Just a general question here. I'm designing a web application that might have 50 to 100 million rows or more. Basically it's a simple logging table, each row probably only 24 bytes wide, but I can see queries taking quite a while to execute.
The query is basically a GROUP BY, showing the number of "hits" per day.
Are there any special strategies I should implement? Or is a properly designed structure with indexes likely sufficient (on the right hardware, of course)?
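For reference, the hits-per-day query itself is simple (table and column names here are hypothetical); the main levers are an index that lets it seek on the date range and, at the 100-million-row scale, possibly a pre-aggregated summary table maintained on a schedule:

-- Hits per day over the last 30 days
SELECT CONVERT(varchar(10), hit_time, 120) AS hit_day, COUNT(*) AS hits
FROM dbo.hit_log
WHERE hit_time >= DATEADD(dd, -30, GETDATE())
GROUP BY CONVERT(varchar(10), hit_time, 120)
ORDER BY hit_day

-- Supporting index so the WHERE clause seeks instead of scanning
CREATE INDEX IX_hit_log_hit_time ON dbo.hit_log (hit_time)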
I have these two queries:

SELECT COUNT(id) AS viewcount FROM location_views
WHERE createdate > DATEADD(dd,-30,getdate()) AND objectid = 357

SELECT COUNT(id)*2 AS clickcount FROM extlinks
WHERE createdate > DATEADD(dd,-30,getdate()) AND objectid = 357
But I want to add the COUNT statements, so this is what I did:
SELECT COUNT(vws.id) + COUNT(lnks.id)*2 AS totalcount
FROM location_views vws, extlinks lnks
WHERE (vws.createdate > DATEADD(dd,-30,getdate()) AND vws.objectid = 357)
   OR (lnks.createdate > DATEADD(dd,-30,getdate()) AND lnks.objectid = 357)
It turns out the query becomes immensely slow. There must be something I'm doing wrong here that results in such bad performance, but what is it?
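The likely culprit: the comma join with no join condition makes SQL Server form the full cross product of location_views and extlinks before the WHERE clause filters it, so the row count explodes. Summing two independent subqueries keeps the two scans separate and should behave like the original pair of queries:

SELECT (SELECT COUNT(id) FROM location_views
        WHERE createdate > DATEADD(dd,-30,getdate()) AND objectid = 357)
     + (SELECT COUNT(id) FROM extlinks
        WHERE createdate > DATEADD(dd,-30,getdate()) AND objectid = 357) * 2
       AS totalcount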
As part of my SSIS package, a list of sites is produced that need to be created on a remote machine, say 1000 sites. I need to pass this list to a web service so the web service sitting on that machine creates these sites for me. My SSIS package does not run frequently, so I can sacrifice a little time to get better functionality.
I need to move the sites that are not created by the web service (for any reason) into one table and the successfully created sites into another table, so I need confirmation for each site from the web service.
Which option is better?
1) Calling the web service for every single record (site), getting the confirmation, and then moving the records accordingly based on the confirmation. I know this might be very time-consuming, but as I said, my SSIS package might only run every six months.
OR
2) Sending the records to the web service in a batch and getting the result back. I don't know how to do this, though.
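Whichever option wins, the record-moving half can stay in plain T-SQL once the web service's per-site confirmations have been landed in a staging table (all table and column names below are hypothetical):

-- Sites the service confirmed
INSERT INTO dbo.SitesCreated (SiteID, SiteName)
SELECT s.SiteID, s.SiteName
FROM dbo.SiteStaging s
JOIN dbo.ServiceResults r ON r.SiteID = s.SiteID
WHERE r.Succeeded = 1

-- Sites that failed, with the reason the service gave
INSERT INTO dbo.SitesFailed (SiteID, SiteName, FailReason)
SELECT s.SiteID, s.SiteName, r.ErrorMessage
FROM dbo.SiteStaging s
JOIN dbo.ServiceResults r ON r.SiteID = s.SiteID
WHERE r.Succeeded = 0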
We're experiencing the following error regularly: "I/O error 1450(Insufficient system resources exist to complete the requested service.)"
We are running SQL 2000 with 2GB of memory on Windows 2000 Advanced Server with 4 CPUs. We are using SAN storage connected via two Fibre Channel cards. The databases range from 10GB to 400GB in size and serve decision-support applications.
Although it seems clear that the disk subsystem is causing this error, our hosting provider is blaming the application layer for this behaviour.
From the SQL Server log:

I/O error 1450(Insufficient system resources exist to complete the requested service.) detected during read at offset 0x00000350860000 in file 'O:\DATA\SA_Data.MDF'. Error: 823, Severity: 24, State: 2
I/O error 1450(Insufficient system resources exist to complete the requested service.) detected during read at offset 0x00000350862000 in file 'O:\DATA\SA_Data.MDF'. Error: 823, Severity: 24, State: 2
From the event log:

[..]
dmio: Harddisk37 read error at block 23192007: status 0xc000009a
dmio: Harddisk35 read error at block 23192135: status 0xc000009a
dmio: Harddisk36 read error at block 23192127: status 0xc000009a
dmio: Harddisk36 read error at block 23192263: status 0xc000009a
[etc]
dmio: Disk Harddisk31 block 23193791 (mountpoint O:): Uncorrectable read error
[..]
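Given uncorrectable read errors reported at the volume level, one useful step for the evidence trail is a consistency check on the affected database (the name here is inferred from the file name, so treat it as an example); block-level read failures originate below the application layer regardless of what the check finds:

DBCC CHECKDB ('SA') WITH NO_INFOMSGS, ALL_ERRORMSGS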
I am wondering if it is possible to use SSIS to sample a data set into a training set and a test set and feed them directly into my data mining models, without saving them somewhere, as they occupy too much space. I really need guidance on this.
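SSIS does ship Percentage Sampling and Row Sampling transforms that split a pipeline without persisting anything, but if doing it in the source query is acceptable, here is a sketch (the table name is hypothetical; the split is random and not repeatable across runs):

-- Tag roughly 70% of rows as train and 30% as test on the fly
SELECT *,
       CASE WHEN ABS(CHECKSUM(NEWID())) % 100 < 70
            THEN 'train' ELSE 'test' END AS sample_set
FROM dbo.CaseData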
I have used both data readers and data adapters (with datasets) in the projects I have worked on. I am trying to get some clarification on when I should be using which one. I think I am doing this correctly, but I want to be sure I am developing good habits.
As the name suggests, it seems like a DataReader is only for reading data. I have read that the data adapter and dataset are for a disconnected architecture, or at least that they can be used for that type of setup. I have been using the data adapter and datasets when writing to a database and the DataReader when reading from a database.
Is this how they should be used? Is the DataReader the best choice for reading data? Am I doing this the optimal way from a performance standpoint?
Thanks in advance.
We have already integrated data from different clients into MDS with the MS Excel plugin; now we want to push updated or newly added records back to the source database. Is it possible to do this using MDS? Is there any background sync process which automatically syncs data to and from the subscriber and MDS?
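As far as I know, MDS has no built-in push-back; the supported export mechanism is a subscription view created in the MDS web UI, which a scheduled job can then pull from into the source system. A rough sketch (the view and table names, and the assumption that Code and Name are the columns needed, are all hypothetical):

INSERT INTO SourceDB.dbo.Customer (CustomerCode, CustomerName)
SELECT v.Code, v.Name
FROM MDS.mdm.[CustomerSubscriptionView] v
WHERE NOT EXISTS (SELECT 1 FROM SourceDB.dbo.Customer c
                  WHERE c.CustomerCode = v.Code)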
When I enter over 4000 characters into any ntext field in my SQL Server 2005 database (directly in the database and through the application), I get an error saying that the data could not be updated because string or binary data would be truncated. Has anyone ever seen this? I cannot figure out what is causing it; ntext should be able to hold a lot more data than this...
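One common cause worth checking, sketched with hypothetical names: the column is ntext, but the value passes through a variable or parameter declared nvarchar(4000) on its way in, and that declaration is what truncates. On 2005, widening the parameter avoids it:

-- A parameter declared nvarchar(4000) silently caps the value;
-- nvarchar(max) (or ntext) does not:
ALTER PROCEDURE dbo.UpdateNotes
    @ID    int,
    @Notes nvarchar(max)
AS
UPDATE dbo.Records SET Notes = @Notes WHERE ID = @ID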
I have a requirement to implement CDC for 50+ tables so that warehouse/reporting loads pick up incremental data changes rather than exporting the whole table data. The largest table has more than half a billion records.
The warehouse uses a daily copy of the OLTP DB (a daily DB refresh). How can I accomplish this? Is there a downside to implementing CDC just for the sake of taking incremental changes on the tables?
Is there any performance impact if we enable CDC on the OLTP DB?
Can we make use of the CDC tables in an environment where we do a daily DB refresh, so that the queries don't hit the OLTP database?
What is the best way to implement CDC to capture incremental changes for reporting?
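For reference, enabling CDC is a per-database call followed by one call per table; these are the real system procedures (the database and table names are examples):

USE OLTPDB
GO
EXEC sys.sp_cdc_enable_db
GO
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'Orders',
    @role_name     = NULL   -- no gating role
GO

The change data is read asynchronously from the transaction log by a capture job, so the overhead on the OLTP workload is modest but not zero, and the log cannot truncate past what the capture job has processed.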
Hi, this is driving me nuts. I have a table that stores notes about an operation in an IMAGE data type field in MS SQL Server 2000. I can read and write with no problem from Access using the StrConv function, and I can update the field correctly in T-SQL using:

DECLARE @ptrval varbinary(16)
SELECT @ptrval = TEXTPTR(BITS_data)
FROM mytable_BINARY WHERE ID = 'RB215'
WRITETEXT OPERATION_BINARY.BITS @ptrval 'My notes for this operation'

However, I just cannot seem to convert the information back to text once it is stored, using T-SQL. My selects keep returning binary data. How do I do this? Thanks for your help. SD
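A sketch of the usual SQL 2000 trick: cast the image value to varbinary first and then to varchar (each cast is capped at 8000 bytes on SQL 2000; for longer values, READTEXT with a text pointer is the route):

SELECT CAST(CAST(BITS_data AS varbinary(8000)) AS varchar(8000)) AS notes_text
FROM mytable_BINARY
WHERE ID = 'RB215'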
I'm using a Script Component to load data into an Oracle DB because of a poor-performance issue. Now I have found that it is missing some data during the transmission. Please see the screenshot below:
[DTS.Pipeline] Error: "component "Excel Source" (1)" failed validation and returned validation status "VS_NEEDSNEWMETADATA".
and also this:
[Excel Source [1]] Warning: The external metadata column collection is out of synchronization with the data source columns. The column "Fiscal Week" needs to be updated in the external metadata column collection. The column "Fiscal Year" needs to be updated in the external metadata column collection. The column "1st level" needs to be added to the external metadata column collection. The column "2nd level" needs to be added to the external metadata column collection. The column "3rd level" needs to be added to the external metadata column collection. The "external metadata column "1st Level" (16745)" needs to be removed from the external metadata column collection. The "external metadata column "3rd Level" (16609)" needs to be removed from the external metadata column collection. The "external metadata column "2nd Level" (16272)" needs to be removed from the external metadata column collection.
I tried going to the data flow -> Excel connection -> Advanced Editor for the Excel source -> Input and Output Properties and tried to refresh the affected columns. It seems that somehow the three columns are not read in from the source file, and also that Fiscal Year and Fiscal Week are not set up properly in my data destination? Has anyone faced such errors before?
When I execute the stored procedure below, I get the error "Arithmetic overflow error converting expression to data type int".
USE [FileSharing]
GO
/****** Object: StoredProcedure [dbo].[xlaAFSsp_reports] Script Date: 24.07.2015 17:04:10 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
[Code] .....
Msg 8115, Level 16, State 2, Procedure xlaAFSsp_reports, Line 25
Arithmetic overflow error converting expression to data type int.
The statement has been terminated.
(1 row(s) affected)
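Since the procedure body is elided above, only the general pattern can be offered: on the reported line, some arithmetic over int values exceeds 2,147,483,647 before it is stored, so widen the operands to bigint before the arithmetic happens (the column and table here are hypothetical):

-- Overflows once the product exceeds the int range:
-- SELECT SUM(FileSizeKB * 1024) FROM dbo.Files
-- Widened before the multiplication, it does not:
SELECT SUM(CAST(FileSizeKB AS bigint) * 1024) FROM dbo.Files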
Is there a step-by-step paper for getting there? Here is what I need to consider: I will have many customers that each need their own set of records and access pages branded for their company, and each customer will have many clients. I am hosting this application on a Windows 2003 server with SQL Server 2005 Enterprise.
I am using Windows authentication. I created a username in Windows, then added the Windows user in SQL Management Studio under Security, granted "DB read" and "DB write", and did so again under the database's security tab. Still, authentication from the web fails. I must be missing a step or two?
I expect to set up a username for each database as I set up new customers.
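For each new customer database, the T-SQL equivalent of those clicks looks like this (the login and database names are examples). One thing worth checking in this setup: with Windows authentication from ASP.NET, unless impersonation is enabled, the identity that actually connects is the worker-process account (commonly NT AUTHORITY\NETWORK SERVICE on Windows 2003), not the hand-created user, so that account is the one that needs the login:

CREATE LOGIN [MYSERVER\CustomerOneUser] FROM WINDOWS
GO
USE CustomerOneDB
GO
CREATE USER [MYSERVER\CustomerOneUser] FOR LOGIN [MYSERVER\CustomerOneUser]
EXEC sp_addrolemember 'db_datareader', 'MYSERVER\CustomerOneUser'
EXEC sp_addrolemember 'db_datawriter', 'MYSERVER\CustomerOneUser'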