Hello,
I have not been able to locate information on the following problem. The first step in my package (an Execute SQL task) deletes the data from an MS Access database table. The package hangs at this step after all validation is complete. Later in the package, once the table data is deleted, it is repopulated. The deletion step and the repopulation step use the same connection manager.
There is no information in the log about an error. At the time the package ran, there was a lock file on the database with about six users connected. I'm not sure what version of Access the database was created in, but I have 2003 on my machine, and I cannot open the database.
I am using this statement to delete a single row in a SQL table: "DELETE FROM Random WHERE NewID = '" & strwinner & "'",
where "strwinner" is the variable that identifies the row to be deleted. The problem is that when I check the table in SQL, the row that was supposed to be deleted is still there. It does not give me any error message or anything. I am executing this statement with ExecuteNonQuery in my .aspx page. Please help.
One of my customers is asking us to delete 95% of the rows in a table that has around 645,708 rows. My concerns are: what criteria should we follow when deleting a huge number of records from a table, and what steps need to be taken care of? After the deletion, what maintenance tasks should we perform to keep everything in good shape? Lastly, will deleting 95% of the rows improve the table's performance?
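For a delete of that scale, the usual advice is to remove rows in batches so the transaction log and locking stay manageable, or alternatively copy the 5% to keep into a new table, truncate the original, and reload. A minimal batching sketch, with a hypothetical table BigTable and a filter KeepFlag = 0 marking the rows to remove:

-- DELETE TOP needs SQL 2005+; on SQL 2000 use SET ROWCOUNT 10000 instead.
WHILE 1 = 1
BEGIN
    DELETE TOP (10000) FROM BigTable WHERE KeepFlag = 0
    IF @@ROWCOUNT = 0 BREAK
END

-- Post-delete maintenance (ALTER INDEX syntax is also SQL 2005+):
ALTER INDEX ALL ON BigTable REBUILD
UPDATE STATISTICS BigTable WITH FULLSCAN

Rebuilding indexes and updating statistics is where most of any performance gain would come from; deleting rows by itself does not shrink or reorganize anything.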
We have an employee table that contains bank details, and we are experiencing problems with account numbers being erased and lost. In order to track down why this is happening (either due to our application code or SQL replication), we'd like to be able to prevent certain columns from being overwritten if they already contain data.

Is it possible to set up a check constraint to prevent our ee_acct_no column from being set to NULL or a blank string if it contains an account number (i.e. a 9-digit number)? We have set up the column to allow NULLs because we don't always know an employee's bank details at first, so we do need to add employees to the database without bank details initially.

Also, if possible, can someone suggest a stored procedure or trigger I could create that would fire a user-defined error message and email an operator if a bank account number changed?

Many thanks,
Dan Williams.
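A check constraint only sees the new value of a row, not the old one, so it cannot express "don't blank a column that was already populated." A trigger can compare the inserted and deleted pseudo-tables. A minimal sketch, assuming the table is named employee with a key column emp_id (both assumptions about the schema):

CREATE TRIGGER trg_protect_acct ON employee
FOR UPDATE
AS
IF UPDATE(ee_acct_no)
BEGIN
    IF EXISTS (SELECT 1
               FROM deleted d
               JOIN inserted i ON i.emp_id = d.emp_id
               WHERE d.ee_acct_no IS NOT NULL
                 AND (i.ee_acct_no IS NULL OR i.ee_acct_no = ''))
    BEGIN
        RAISERROR('ee_acct_no cannot be cleared once populated.', 16, 1)
        ROLLBACK TRANSACTION
    END
END

For the notification, the trigger could log the change to an audit table and a job could email an operator via xp_sendmail (or Database Mail on SQL 2005+); sending mail directly from a trigger is usually discouraged because it runs inside the transaction.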
How can we delete from a parent table as well as its child table using a single query applied to the parent table? Can someone please help me with this topic? It would be very nice of you guys.
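There is no single DELETE statement that touches two tables, but a cascading foreign key gives the same effect: deleting the parent row automatically deletes its children. A sketch with hypothetical Parent/Child tables:

ALTER TABLE Child
ADD CONSTRAINT FK_Child_Parent
    FOREIGN KEY (ParentID) REFERENCES Parent (ParentID)
    ON DELETE CASCADE

-- A single statement against the parent now removes its children too:
DELETE FROM Parent WHERE ParentID = 1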
I have a table in SQL Server 2000 with over 94,000 records. I have to delete any record whose language is other than English; I need to clean the table by removing all the data in other languages. My main table has 12 fields.
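One hedged approach, assuming "another language" in practice means characters outside the printable ASCII range, and a hypothetical nvarchar column named Title: a binary collation makes the LIKE range literal, so the affected rows can be reviewed with a SELECT before being removed:

SELECT *
FROM MainTable   -- table and column names are placeholders
WHERE Title COLLATE Latin1_General_BIN LIKE N'%[^ -~]%'

-- Once the SELECT returns only the unwanted rows, change it to:
-- DELETE FROM MainTable WHERE Title COLLATE Latin1_General_BIN LIKE N'%[^ -~]%'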
We have been asked whether it is possible to completely delete sensitive data. When data is deleted it is not actually completely removed from the database but is marked as deleted. It is possible to zero out that data using sp_clean_db_free_space but this doesn't affect the transaction log.
We had hoped that CHECKPOINT would clear the data from the log, but we are not completely certain of this. So the question becomes: is there a built-in command or function in SQL Server 2008, or subsequent versions, that will completely remove deleted data, both from the transaction log and from the database itself? Or is there a third-party tool to accomplish the same thing?
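As far as I know there is no built-in command that scrubs deleted rows out of the transaction log itself; log records only become overwritable after the log space is freed (a log backup under full/bulk-logged recovery, or a checkpoint under simple recovery), and even then the bytes are not zeroed. For the data files, a sketch of the documented option, with a placeholder database name:

-- Zero out ghost records and free space in the data files (SQL 2008+):
EXEC sp_clean_db_free_space @dbname = N'MyDatabase'

-- Freeing log space so old records get overwritten, under full recovery:
BACKUP LOG MyDatabase TO DISK = N'C:\Backups\MyDatabase.trn'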
I have a DTS package that creates a table which is used on my local intranet. The DTS package runs without error, and the table is created/populated/transferred to the appropriate database. Then it appears that some action on this table truncates it. I have been unable to determine the culprit. Can I create a trigger that will capture truncation? I have tried to create a trigger to capture this information, but none that I attempt seems to capture a truncation, or a drop and re-create of the table. Any help would be greatly appreciated. MT.
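For what it's worth, DML triggers do not fire on TRUNCATE TABLE or DROP TABLE at all, which would explain the failed attempts. On SQL Server 2005 or later, a DDL trigger can at least capture the drop/recreate; TRUNCATE is not a DDL event either, so a Profiler trace is the usual way to catch that. A sketch, assuming a 2005+ instance and an audit table created for the purpose:

-- Audit table to hold the events:
CREATE TABLE dbo.SchemaAudit (EventTime datetime NOT NULL, EventData xml NULL)
GO
-- Database-scoped DDL trigger; fires on drop/create but NOT on TRUNCATE:
CREATE TRIGGER trg_audit_table_ddl ON DATABASE
FOR DROP_TABLE, CREATE_TABLE
AS
    INSERT INTO dbo.SchemaAudit (EventTime, EventData)
    VALUES (GETDATE(), EVENTDATA())
GO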
We have several databases linked via merge replication. Due to business requirements, we needed to delete 5M rows in one table, and we did it on one subscriber. However, the publisher kept uploading the delete operations from the subscriber, which blocked any download from publisher to subscriber. How can we accelerate replication now? This has already been running for 2 days and will continue for another 1-2 days. Is it possible to make the publisher do the download before the upload? How can we speed up a large deletion in a replicated environment?
Hi all, is there any query to move certain data from a SQL table to an Access table? I have a requirement where I have to fetch the records from a SQL table that fall within a specified range into an MS Access table. Is this possible through a query?
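One approach, assuming the target table already exists in the .mdb and 'Ad Hoc Distributed Queries' is enabled on the server: an INSERT through OPENROWSET with the Jet provider. The file path, table, and column names here are all placeholders:

INSERT INTO OPENROWSET('Microsoft.Jet.OLEDB.4.0',
    'C:\Data\Target.mdb'; 'Admin'; '',
    TargetTbl)
SELECT Col1, Col2
FROM dbo.SourceTable
WHERE Col1 BETWEEN 100 AND 200   -- the "specified range"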
I have two databases (MYDB1, MYDB2) on two different servers (SERVER1, SERVER2). I want to create a stored procedure in MYDB1 on SERVER1 that gets some data from a table in MYDB2 on SERVER2. How can I do this?
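The standard route is a linked server plus four-part names. A sketch to run on SERVER1, where SomeTable is a placeholder name:

-- One-time setup on SERVER1 (add sp_addlinkedsrvlogin if the two
-- servers do not share security):
EXEC sp_addlinkedserver @server = N'SERVER2', @srvproduct = N'SQL Server'
GO
-- Then, in MYDB1 on SERVER1:
CREATE PROCEDURE dbo.GetRemoteData
AS
    SELECT *
    FROM SERVER2.MYDB2.dbo.SomeTable
GO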
How can I load CSV file data into a SQL Server table? I know there are ways, like BULK INSERT, to load CSV data into a table, but in my case the table doesn't exist and has to be created at run time. With a simple insert into a temp table we do something like SELECT * INTO #temp FROM tablename, and that creates the temp table. I need something like that which creates the table and loads the data into it, because the CSV files can have different numbers of columns and different column names, so I cannot create the table structure in advance.
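One hedged possibility, assuming the server itself can reach the file and 'Ad Hoc Distributed Queries' is enabled: the Jet text driver reads the CSV with whatever columns it has, and SELECT ... INTO creates the table from that shape at run time. The directory and file name are placeholders; HDR=Yes takes the column names from the header row:

SELECT *
INTO dbo.CsvImport
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
    'Text;Database=C:\Import\;HDR=Yes',
    'SELECT * FROM data.csv')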
Publisher: SQL 2000 SP4. Distributor: SQL 2005 SP1. Subscribers: a mix of SQL 2005 SP1 and SQL 2000 SP4.

We have a database with around 300 tables and replicate it to around 10 different subscribers, some in the same datacenter and others in different offices. We have around 50-60 different publications for this; some have 20 or so tables, others only one table if the tables are large. Tables range in size from a few hundred rows to over 20 million. Some tables replicate a few commands daily, others 100,000 or more.

Around six weeks ago we started having problems with one table. It has 1.4 million rows and replicates a few thousand commands on a daily basis. We saw a backlog of around 150,000 to 400,000 commands. We had some replication issues at the time with this distributor; we traced them to memory errors, and the replication job would not start or stop. After restarting the job, it pushed all the commands through to the subscribers.

A few weeks ago we noticed another backlog, and this time it wouldn't clear. A few times we just ran another snapshot at night, but then we started to investigate. We noticed that some people were doing mass updates of 30,000 or more rows, and this time they weren't going through. The backlog would just build up and cause blocking on the subscribers, along with spiking the CPU to 50% or more on a constant basis. As soon as we turn off replication for this publication, everything goes back to normal. Meanwhile all the other publications are happily replicating tens of thousands of rows with no problems.

One thing I noticed is that these updates cause a command-to-transaction ratio of thousands to one, where everything else is less than 10 to 1. I thought this was the cause, but then the senior DBA updated around 20,000 rows in another publication in one update, and it went through within a few minutes. Microsoft has been absolutely no help; they are looking through logs, and they tried to tell us to upgrade our hardware, but we have some ancient subscribers for other busy publications with no issues. Right now we are thinking that it might be corruption in the publisher table, and we'll be creating a new one this weekend. Any other ideas?
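Not an answer, but a rough way to watch whether the backlog is actually draining is to count pending commands on the distributor (run in the distribution database):

SELECT COUNT(*) AS pending_commands
FROM distribution.dbo.MSrepl_commands WITH (NOLOCK)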
Hello, I have an SSIS package that contains a DB2 source using the Native OLE DB\IBM OLE DB Provider for DB2 in the connection manager. The connection requires a specific user name and password. 'Test Connection' passes, and I am able to preview data in the OLE DB Source Editor.
The data then goes through a Data Conversion transform. The OLE DB Destination Editor uses a Native OLE DB\SQL Native Client connection manager. 'Test Connection' passes. The SQL Server edition is 2005 Enterprise x64.
When the package is executed within Visual Studio Professional on a Windows XP (SP2) machine, it hangs at this data flow. I set up a data viewer between the DB2 source and the conversion transform, but no data appears (DefaultBufferMaxRows is set to 100). In the 'Execution Results', validation completes, but nothing happens beyond that. There are no errors or messages in the error list. In the Control Flow, the data flow task turns yellow and stays that way.
I have noticed that when this package runs, a command window flashes up for a brief second.
How do I troubleshoot this? Why is there no data in the data viewer between the source and the data conversion transform?
I have a simple data flow task that has an OLE DB source and an OLE DB destination.
The source is a SQL query that returns 200M records, which are then written straight out to a table. I used a data flow task so that I can chunk up the inserts rather than using an INSERT INTO ... SELECT FROM.
I have run it multiple times, and it hangs once it reaches 18,961,020 records.
There is no locking on the database, and I have even restarted the SQL instance to ensure that nothing else was contending for resources.
I have changed the buffer size and rows per buffer to 100M and 100,000. Now it hangs before the 18.9M mark, presumably because of the increased buffer size.
I notice that the SELECT statement continues to clock up IO and CPU cycles, but the BULK INSERT process has gone.
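One way to see what the hung insert is actually doing, since this is a SQL 2005 instance, is to check the session's wait type while the package appears stuck:

SELECT session_id, status, command, wait_type, wait_time, blocking_session_id
FROM sys.dm_exec_requests
WHERE session_id > 50   -- skip system sessions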
I am trying to find a way to look at the size of each table in a database and the last time each table was updated or accessed by a user. I was just given control over a DB that is VERY BADLY maintained, and I want to see what I can get rid of. I want to start deleting by size and last use. I can find creation dates and row counts for the tables, but not their total size or last update/access. Does anyone know how to get this information? Thanks, Nathan
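Sizes are available on any version via sp_spaceused; "last accessed" is only tracked from SQL Server 2005 onward, and only since the last service restart. A sketch of both:

-- Size of every table (sp_MSforeachtable is undocumented but widely used):
EXEC sp_MSforeachtable 'EXEC sp_spaceused ''?'''

-- Last read/write per table since the last restart (SQL 2005+ only;
-- tables absent from this view have not been touched since then):
SELECT OBJECT_NAME(object_id) AS table_name,
       MAX(last_user_seek)   AS last_seek,
       MAX(last_user_scan)   AS last_scan,
       MAX(last_user_update) AS last_update
FROM sys.dm_db_index_usage_stats
WHERE database_id = DB_ID()
GROUP BY object_id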
Hello, I have a Data Flow Source that uses a SQL command to pull data. In the SQL statement, I used CAST to change all varchar types to nvarchar to suit MS Access. I can preview the data from the source. In testing, the SQL statement only pulls about ten records.
I have a Microsoft Access 2000 database table as the destination. Data in each column of the table is required, and all columns have defaults.
I also have a grid data viewer set up, with DefaultBufferMaxRows set to 2 so that I can see data going across. When I execute this data flow, no data is transferred to the Access table, and no data shows up in the data viewer. There are no errors; the 'Execution Results' tab shows no errors or warnings, but indicates that zero rows were transferred.
How do I begin to isolate the problem? The following is the SQL statement in the Data Flow Source. Thank you for your help! - cdun2
DECLARE @CategoryTable TABLE (ColID Int, ColCategory varchar(60), ColValue varchar(500))

--and fill it
INSERT INTO @CategoryTable (ColID, ColCategory, ColValue)
SELECT 0,
       LEFT(RawCollectionData, CHARINDEX(':', RawCollectionData)),
       LTRIM(SUBSTRING(RawCollectionData, CHARINDEX(':', RawCollectionData) + 1, 255))
FROM Collections_Staging

--Assign an ID to each block of data for each occurrence of 'Reason:'
DECLARE @ID int
SET @ID = 1
UPDATE @CategoryTable
SET [ColID] = CASE WHEN ColCategory = 'Reason:' THEN @ID - 1 ELSE @ID END,
    @ID = CASE WHEN ColCategory = 'Reason:' THEN @ID + 1 ELSE @ID END

--Then put the data together
SELECT --cast to nvarchar for MS Access
    a.ColID,
    CAST(a.ColValue AS nvarchar(30)) AS OrderID,
    COALESCE(CAST(b.ColValue AS nvarchar(30)), '') AS SellerUserID,
    COALESCE(CAST(c.ColValue AS nvarchar(100)), '') AS BusinessName,
    COALESCE(CAST(d.ColValue AS nvarchar(15)), '') AS BankID,
    COALESCE(CAST(e.ColValue AS nvarchar(15)), '') AS AccountID,
    COALESCE(CAST(SUBSTRING(f.ColValue, CHARINDEX('$', f.ColValue) + 1, 500) AS DECIMAL(18,2)), 0) AS CollectionAmount,
    COALESCE(CAST(g.ColValue AS nvarchar(10)), '') AS TransactionType,
    CASE WHEN h.ColValue LIKE '%Matching Disbursement%' THEN NULL
         ELSE CAST(h.ColValue AS smalldatetime) END AS DisbursementDate,
    --COALESCE(h.ColValue,'') AS DisbursementDate,
    CASE WHEN i.ColValue LIKE '%Matching Disbursements%' THEN NULL
         WHEN CAST(LEFT(REVERSE(i.ColValue), 4) AS INT) > 1000 THEN CAST(i.ColValue AS smalldatetime)
         WHEN LEFT(REVERSE(i.ColValue), 4) = '1000' THEN NULL END AS ReturnDate,
    --COALESCE(i.ColValue,'') AS ReturnDate,
    COALESCE(CAST(j.ColValue AS nvarchar(4)), '') AS Code,
    COALESCE(CAST(k.ColValue AS nvarchar(255)), '') AS CollectionReason
FROM @CategoryTable a
LEFT JOIN @CategoryTable b ON b.ColID = a.ColID AND b.ColCategory = 'Seller UserId:'
LEFT JOIN @CategoryTable c ON c.ColID = a.ColID AND c.ColCategory = 'Business Name:'
LEFT JOIN @CategoryTable d ON d.ColID = a.ColID AND d.ColCategory = 'Bank ID:'
LEFT JOIN @CategoryTable e ON e.ColID = a.ColID AND e.ColCategory = 'Account ID:'
LEFT JOIN @CategoryTable f ON f.ColID = a.ColID AND f.ColCategory = 'Amount:'
LEFT JOIN @CategoryTable g ON g.ColID = a.ColID AND g.ColCategory = 'Transaction Type:'
LEFT JOIN @CategoryTable h ON h.ColID = a.ColID AND h.ColCategory = 'Disbursement Date:'
LEFT JOIN @CategoryTable i ON i.ColID = a.ColID AND i.ColCategory = 'Return Date:'
LEFT JOIN @CategoryTable j ON j.ColID = a.ColID AND j.ColCategory = 'Code:'
LEFT JOIN @CategoryTable k ON k.ColID = a.ColID AND k.ColCategory = 'Reason:'
WHERE a.ColCategory = 'Order ID:'
I have an SSIS (CTP June 2005) package with several data flow tasks. One data flow has 2 OLE DB data sources which are unioned together, followed by a conditional split, then a derived column transformation, which feeds into an OLE DB destination. When I run the package, this particular data flow seems to run fine and then just stops: there are no errors and the record counts don't move. I've used data viewers to look at the data, and it seems fine. I've swapped the OLE DB destination for a flat file and execution runs fine; switching back to OLE DB, it hangs again (over and over). There's nothing unusual about the OLE DB destination, and all the other OLE DB destinations work fine. It also seems to hang at the same destination row count each time, but that row has been verified as valid. In fact, I dropped the DB table and recreated it with no constraints and all fields nullable, but the problem persists. Help?
Hi! I am developing a CF 2.0 application for WM 6.0. In the application I'm doing replication between a SQL Server 2005 database and a SQL Server 2005 Compact Edition database. When I synchronize the databases for the first time, i.e. creating a new database with AddSubscription(AddOption.CreateDatabase), I cannot save afterwards: the synchronization itself works just fine and I get the right data onto my device, but when I try to save, the database hangs during the Commit().
If, on the other hand, I restart the application and then do a ReinitializeSubscription(true), i.e. do not create a new database with AddSubscription(AddOption.CreateDatabase), and call Synchronize(), everything works just fine. Does anyone have an explanation for this? (I do a Dispose() each time.)
How do I get a system table like 'sysobjects' into the Data Access Layer? My app generates tables on the fly and has to check in the sysobjects table which tables are present.
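sysobjects is queryable like any other table, so the data access layer can simply issue a normal SELECT against it (on SQL 2005+ the sys.tables view is the preferred equivalent). The table name below is a placeholder:

SELECT name
FROM sysobjects
WHERE xtype = 'U'                 -- 'U' = user table
  AND name = 'MyGeneratedTable'   -- placeholder for the generated name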
I am using ASP.NET with VB. I have the 2 tables shown below. If I want to delete the row with Forumid 1, how would I also delete the rows in the Topic table that belong to Forumid 1? And how would I do it if I am using a GridView?

Forum table:
Forumid | Forumname
1       | hi
2       | me

Topic table:
Topicid | Forumid | Topicname
1       | 1       | yo
2       | 1       | everyone
3       | 1       | google
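A sketch of the delete itself: remove the child rows first, then the parent, inside one transaction. @ForumId stands in for the key value the GridView passes:

DECLARE @ForumId int
SET @ForumId = 1   -- stands in for the GridView's selected key

BEGIN TRANSACTION
    DELETE FROM Topic WHERE Forumid = @ForumId
    DELETE FROM Forum WHERE Forumid = @ForumId
COMMIT TRANSACTION

From the GridView this could hang off the RowDeleting event or a SqlDataSource DeleteCommand with Forumid as a parameter; an ON DELETE CASCADE foreign key, as in the earlier example, would remove the need for the first DELETE.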
I have an application with MS Access as the front end and SQL Server as the back end, with quite a few tables. I want to write a stored procedure that exports a SQL Server table (both structure and data) to a new Access MDB file. I know it's possible with DTS, but I need to code it myself and perform it at run time. Can anybody help? It's urgent.
In Access 97, I am trying to attach to a table in my SQL Server 2000 database. I use File/Get Data, which displays all the SQL tables, including the one I can't attach to. (I can attach to all the other SQL tables.) But when I select that table, I get a message telling me the name is invalid. I don't understand this, because the Get Data dialog gives me the name; I don't have to enter it manually. Any ideas? Thanks, Bob C.
Multi-user ASP.NET website, SQL Server back end. When we use transactions to insert multiple rows into a table and then commit them once we're happy that everything looks good, the table(s) gets locked before committing. Basically, I want to perform updates and inserts inside a transaction but still allow other users to carry on using the site, and potentially the tables involved in the transaction.
Question: is there a way to stop the table lock occurring and have SQL Server take just row locks, thus avoiding blocking other users? I know that snapshot isolation is a possibility, but I am a little worried about turning it on given the small team we have for testing.
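Two hedged options, with placeholder names. Row versioning (SQL Server 2005+) lets readers see the last committed version instead of blocking on writers' locks; and keeping each batch well under the lock-escalation threshold (roughly 5,000 locks) plus a ROWLOCK hint discourages, though does not forbid, escalation to a table lock:

-- Option 1: row-versioned reads; flipping this needs a moment with no
-- other active connections in the database.
ALTER DATABASE MyDatabase SET READ_COMMITTED_SNAPSHOT ON

-- Option 2: a hint on the hot statement (dbo.Orders/OrderRef are hypothetical);
-- short transactions matter more than the hint itself.
INSERT INTO dbo.Orders WITH (ROWLOCK) (OrderRef)
VALUES ('A123')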
I have several DTS packages that connect to various Oracle databases. One of the databases was recently upgraded from 7.3 to 8i; the other databases were always 8i. Last week I could edit data transfer tasks normally; this week, DTS hangs and I have to use Task Manager to kill the process. It worked fine last week. I can still run the packages successfully, I just can't edit them. I have no trouble editing or running packages that connect to databases other than the one recently upgraded. I have tried both OLE DB and ODBC connections with the same results. Does anyone have any ideas on how to fix this?
Hello. I want to ask about the possibility of copying both a table's structure and its contents from a SQL Server table to a table within MS Access. The problem cannot be solved with a permanent table structure at the target location: the names of the columns are essentially data within the application and so are subject to change. I am targeting a solution using SQL Query Manager.

The approach I have tried (without success) is:

SELECT * INTO <linked server table> FROM <local table>

This should create and copy. However, I am not sure if this is achievable with this approach. Refer to the dialogue:

-------------------------------------------------------
USE MASTER
GO

EXEC sp_addlinkedserver
    @SERVER = 'Freddie',
    @PROVIDER = 'Microsoft.Jet.OLEDB.4.0',
    @SRVPRODUCT = 'OLE DB Provider for Jet',
    @DATASRC = 'C:\temp\HMIS_Recipe.mdb'

-- I am not sure if this is required
EXEC sp_addlinkedsrvlogin 'Freddie', false, 'sa', 'Admin', NULL

SELECT * FROM Freddie...FRED              -- This is OK
SELECT * INTO #Temp FROM Freddie...FRED   -- This is OK

-- This fails - refer error
SELECT * INTO Freddie.FRED65 FROM #temp
Server: Msg 2760, Level 16, State 1, Line 1
Specified owner name 'Freddie' either does not exist or you do not have permission to use it.

-- This also fails, and I thought it reflected the above select with naming - refer error
SELECT * INTO Freddie...FRED65 FROM #temp
Server: Msg 117, Level 15, State 1, Line 2
The object name 'Freddie...' contains more than the maximum number of prefixes. The maximum is 2.

EXEC sp_dropserver 'Freddie', @droplogins = 'droplogins'
------------------------------------------------------------

Thank you. Regards, JC...
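For reference, SELECT ... INTO cannot create a table through a linked server, which is what both error messages are complaining about. If the FRED65 table is created inside the .mdb first (from Access itself, or via DTS) with columns matching #temp, the copy works as a plain INSERT through the three-dot name:

INSERT INTO Freddie...FRED65
SELECT * FROM #temp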