I am currently working on a stored procedure that will delete records from multiple tables, and I want to know how to test it in the DEV environment.
Do I need to create new tables copying the structure and constraints from the original tables, or can I use the original tables?
The tables are also used by other developers in the DEV environment, so I am not sure whether any important data might be deleted during my unit testing.
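One low-risk pattern is to run the procedure against the shared tables inside an explicit transaction and roll it back, so nothing other developers rely on is actually deleted. A minimal sketch (the procedure and table names are hypothetical):

BEGIN TRANSACTION;

-- row counts before (hypothetical tables touched by the procedure)
SELECT COUNT(*) AS OrdersBefore     FROM dbo.Orders;
SELECT COUNT(*) AS OrderLinesBefore FROM dbo.OrderLines;

EXEC dbo.usp_DeleteOrderData @OrderId = 12345;  -- the procedure under test

-- verify that exactly the expected rows are gone
SELECT COUNT(*) AS OrdersAfter      FROM dbo.Orders;
SELECT COUNT(*) AS OrderLinesAfter  FROM dbo.OrderLines;

ROLLBACK TRANSACTION;  -- undo everything, leaving shared DEV data intact

The open transaction holds locks on the affected tables until the rollback, so on a busy shared server keep the window short, or test against restored copies of the tables instead.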
We need to finalize an archival approach for one of our Web and client-server applications. The major requirements are:
a) The user can click on the Web front end to start the archival process.
b) The system should move the related data to archive space in a backup location.
c) It could be a batch process.
d) The relationships between tables are extensive, i.e. more than 30 tables need to be managed to archive one component.
e) The database size is not very large (< 10 MB).
We were planning to have a table to store an archival flag, which is set when the user clicks Archive. A batch program would then copy the database to a backup location, delete from the archived copy the entries where the archive flag is not set, and delete from the master database the entries where the archive flag is set. The problem is how to synchronize the changes the next time the archival process runs: the master database will have changed, so how do we put that data into the archival database without removing the existing archived data?
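One incremental alternative is to make the archive additive rather than rebuilding it from a full copy each run. A minimal sketch, with hypothetical database, table, and column names, assuming archived rows are never modified after they are archived:

-- copy newly flagged rows that the archive does not already hold
INSERT INTO ArchiveDB.dbo.Orders (OrderId, CustomerId, OrderDate)
SELECT s.OrderId, s.CustomerId, s.OrderDate
FROM MasterDB.dbo.Orders AS s
WHERE s.ArchiveFlag = 1
  AND NOT EXISTS (SELECT 1
                  FROM ArchiveDB.dbo.Orders AS a
                  WHERE a.OrderId = s.OrderId);

-- then remove the archived rows from the master database
DELETE FROM MasterDB.dbo.Orders
WHERE ArchiveFlag = 1;

Repeated per table (parents before children for the INSERTs, children before parents for the DELETEs), each run only appends to the archive, so data archived by earlier runs is never disturbed.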
Any other approaches or practical solutions would be very helpful.
I am preparing to design an application that will archive files created by another application. In my SQL database I want to store details about the file and then the file itself. Each file is about 500 KB in size, and about 20,000 files will be generated per year.
My preference is to store these files in a BLOB field. It makes storage, linking to the file metadata, backup, etc. easy for me. I have already solved the technical issues surrounding pulling the file in and out of a BLOB field.
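For reference, a minimal sketch of the kind of table I have in mind (illustrative names; varbinary(max) assumes SQL Server 2005 or later, while on SQL Server 2000 the column type would be image):

CREATE TABLE dbo.ArchivedFile
(
    FileId        int IDENTITY(1,1) PRIMARY KEY,
    OriginalName  nvarchar(260)   NOT NULL,
    ArchivedAtUtc datetime        NOT NULL DEFAULT (GETUTCDATE()),
    Contents      varbinary(max)  NOT NULL  -- the ~500 KB file itself
);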
By my calculations I will need a server with 10 GB of disk space for each year of files archived (20,000 × 500 KB), which doesn't seem outlandish for a table size.
I do not want to design my application, however, only to find out a year from now that I should have been storing these files in a traditional file system because of (... whatever ...) and just linking to them by path in the SQL database.
I'm curious what users of this forum believe to be the best practices surrounding this type of database.
Our OLAP environment involves an ETL/Data Warehouse/Data Mart server and a cube publisher server. We would like to learn how to automate the archival/restore of OLAP databases. We are currently doing it manually through OLAP Manager. Any help would be appreciated. Thanks, James.
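For Analysis Services 2000, the usual way to script what OLAP Manager does is the msmdarch.exe command-line utility, which can run from a scheduled batch job. A sketch with placeholder server, path, and database names (and from my recollection of the switches, so verify against your installation):

REM archive an OLAP database to a .cab file
msmdarch /a OLAPSERVER "C:\Program Files\Microsoft Analysis Services\Data\" FoodMart "D:\Backups\FoodMart.cab"

REM restore it from the archive
msmdarch /r OLAPSERVER "C:\Program Files\Microsoft Analysis Services\Data\" "D:\Backups\FoodMart.cab"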
I was trying to extract data from the source server using an OLE DB Source and a SQL Server Destination when I encountered this error:
"Transaction (Process ID 135) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.".
What can be done so that even if the table being queried is locked, I won't experience a deadlock?
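If the extract can tolerate dirty reads, one common workaround is to read without taking shared locks, so the source query cannot end up in a reader/writer deadlock. A sketch (table and column names are placeholders), using a SQL command in the OLE DB Source instead of a table name:

SELECT col1, col2
FROM dbo.SourceTable WITH (NOLOCK);  -- no shared locks, but uncommitted rows may be read

If dirty reads are not acceptable, enabling READ_COMMITTED_SNAPSHOT on the source database removes reader/writer blocking without them, at the cost of tempdb version-store usage.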
Hello all, I am running into an interesting scenario on my desktop. I'm running Developer Edition on Windows XP Professional (9.00.3042.00, SP2, Developer Edition). The OS is auto-patched via corporate policy, and I saw some patches go in last week. This machine is also a hand-me-down, so I don't have a clean install of the databases on the machine, but I am a local admin.
So, starting last week after a forced remote reboot (also a policy), I noticed a few of the databases didn't start back up. I chalked it up to the hard shutdown and went along my merry way. On Friday, however, I know I shut my machine down cleanly, yet this morning when I booted up I was in the same state I was in last Wednesday. 7 of the 18 databases on my machine came up with:
FCB::Open: Operating system error 32 (The process cannot access the file because it is being used by another process.) occurred while creating or opening file 'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\Test.mdf'. Diagnose and correct the operating system error, and retry the operation. It also logs: FCB::Open failed: Could not open file C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\Test.mdf for file number 1. OS error: 32 (The process cannot access the file because it is being used by another process.).
I've caught references to the auto-close feature being a possible culprit; no dice, as the databases in question have it set to False. Recovery mode varies on the databases from Simple to Full. If I cycle the SQL Server service, whatever transient issue it was having with those files is gone. As much as I'd love to disable the virus scanner, network security would not be amused. The data and log files appear to have the same permissions as unaffected database files. Nothing is set to read-only or archive, which I've seen on other forums as possible gremlins. I have sufficient disk space, and the databases are set for unrestricted growth.
Any thoughts on what I could look at? If everything came up in RECOVERY_PENDING it would make more sense to me than the hit-or-miss behavior I'm experiencing now.
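For the next occurrence, two quick checks may narrow it down. On the SQL side, capture the state of every database (standard catalog view in 2005):

SELECT name, state_desc
FROM sys.databases;   -- which of the 18 are not ONLINE, and in what state

On the OS side, a tool such as Sysinternals Process Explorer (Find Handle or DLL, searching for the .mdf name) can identify which process is holding the file; a virus scanner touching the data files during startup would produce exactly this kind of intermittent error 32.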
Dear list, I'm designing a package that uses Microsoft's preplog.exe to prepare web log files to be imported into SQL Server.
What I'm trying to do is convert this cmd, which works 100% from the DOS prompt, into an Execute Process Task: D:\SSIS Process\Prepweblog\ProcessLoad>preplog ex.log > out.log
However, when I use the Execute Process Task I get this error: [Execute Process Task] Error: In Executing "D:\SSIS Process\Prepweblog\ProcessLoad\preplog.exe" "" at "D:\SSIS Process\Prepweblog\ProcessLoad", The process exit code was "-1" while the expected was "0".
There are two package variables: User::gsPreplogInput = ex.log and User::gsPreplogOutput = out.log.
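A likely cause, judging from the exit code alone (an assumption, not something the log states): the > redirection is interpreted by cmd.exe, so when the task launches preplog.exe directly the redirection is never performed and the process fails. One sketch of a fix is to make cmd.exe the executable and hand it the whole command line:

Executable:       C:\Windows\System32\cmd.exe
Arguments:        /c "preplog.exe ex.log > out.log"
WorkingDirectory: D:\SSIS Process\Prepweblog\ProcessLoad

The /c switch makes cmd.exe run the command and exit; to keep the file names dynamic, the Arguments property can be driven by a property expression that concatenates User::gsPreplogInput and User::gsPreplogOutput.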
How do I use the Execute Process Task? I am trying to unzip a file using the freeware PZUnzip.exe. I tried to place the entire command in a batch file and specified the working directory as the location of the batch file, but the task fails with the error:
SSIS package "IngramWeeklyPOS.dtsx" starting.
Error: 0xC0029151 at Unzip download file, Execute Process Task: In Executing "C:\ETL\POSData\IngramWeekly\Unzip.bat" "" at "C:\ETL\POSData\IngramWeekly", The process exit code was "1" while the expected was "0".
Then I tried to specify the exe directly in the Executable property and the arguments as the location of the zip file and the directory to unzip the files into, but this time it fails with the following message:
SSIS package "IngramWeeklyPOS.dtsx" starting.
Error: 0xC002F304 at Unzip download file, Execute Process Task: An error occurred with the following error message: "%1 is not a valid Win32 application".
The command in the batch file unzips the file perfectly when run from the command line, so there is absolutely no problem with the command itself; I believe the issue is just the setup of the values on the Execute Process Task editor under Process. Any input on resolving this will be much appreciated.
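The "%1 is not a valid Win32 application" message typically means Windows was handed something that is not an .exe to execute; a .bat file cannot go directly in the Executable property. A sketch that routes the batch file through the command interpreter (paths taken from the error messages above):

Executable:       C:\Windows\System32\cmd.exe
Arguments:        /c "C:\ETL\POSData\IngramWeekly\Unzip.bat"
WorkingDirectory: C:\ETL\POSData\IngramWeekly

If the task then still fails with exit code 1, that code is coming from PZUnzip itself, and echoing %ERRORLEVEL% at the end of the batch file will confirm what the unzip step actually returned.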
I am designing a utility that will keep two similar databases in sync; in other words, it copies the new data from db1 to db2 and updates the old data from db1 to db2.
For this I am making use of the tablediff utility, which, when provided with the server name, database, and table info, will generate a .sql file that can be used to keep the target table in sync with the source table.
I am using the Execute Process Task, and the process parameters I am providing are:
The customer.bat file will have the following code: tablediff -sourceserver "LV-SQL5" -sourcedatabase "TC_CTI" -sourcetable "CUSTOMER_1" -destinationserver "LV-SQL2" -destinationdatabase "TC_CTI" -destinationtable "CUSTOMER" -f "c:\SQL_bat_Files\sql5\TC_CTI\sql_files\customer1"
The .sql file will be generated at C:\SQL_bat_Files\sql5\TC_CTI\sql_files\customer1.
The problem: the Execute Process Task is working fine, i.e., the tables are being compared correctly and the .sql file is being generated as desired. But the task itself is reporting failure with the following error:
[Execute Process Task] Error: In Executing "C:\SQL_bat_Files\SQL5\TC_CTI\package_occurrence.bat" "" at "C:\Program Files (x86)\Microsoft SQL Server\90\COM", The process exit code was "2" while the expected was "0".
Some of you may suggest just setting ForceExecutionResult = Success (in fact, this is what I am doing now just to get the program working), but this is not what I want.
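If I recall the tablediff return codes correctly (worth verifying in Books Online), 0 means the tables are identical, 1 a critical error, and 2 that differences were found, so a successful comparison that finds rows to sync exits with 2, and the task, expecting 0, reports failure. A more targeted fix than ForceExecutionResult is either to set the task's SuccessValue property to 2, or to normalize the code at the end of customer.bat:

REM treat "differences found" (exit code 2) as success
if %ERRORLEVEL%==2 exit /b 0

Setting SuccessValue = 2 alone would make a run where the tables are already identical (exit code 0) fail, which is why the batch-file remap may be the safer of the two.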
I'm pulling data from an Oracle db and loading it into MS SQL 2008. For my data type checks during the data load process, what are the options to ensure that the data being processed won't fail? That is, I want to verify the data first against the target data types and, if the format is valid, load it into the destination table; otherwise, mark it with an error flag and push it into an errors table. All this at the row level. One way I can think of is to load into a staging table, then get the source and destination tables' column data types, compare them, and proceed.
Or should I just try loading the data directly and, if it fails, try troubleshooting (which could be a difficult task, as I wouldn't know what caused the error...)?
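A minimal sketch of the staging-table route (hypothetical names; on SQL Server 2008 the row-level checks are limited to functions like ISDATE and ISNUMERIC, since TRY_CONVERT only arrived in 2012): land every column as a string so the load itself cannot fail, then route each row by whether its values convert.

-- stage everything as strings
CREATE TABLE dbo.Stage_Orders
(
    OrderId   varchar(20),
    OrderDate varchar(30),
    Amount    varchar(30)
);

-- rows that convert cleanly go to the destination
INSERT INTO dbo.Orders (OrderId, OrderDate, Amount)
SELECT CONVERT(int, OrderId),
       CONVERT(datetime, OrderDate),
       CONVERT(decimal(12, 2), Amount)
FROM dbo.Stage_Orders
WHERE ISNUMERIC(OrderId) = 1 AND ISDATE(OrderDate) = 1 AND ISNUMERIC(Amount) = 1;

-- everything else is flagged and pushed to an errors table
INSERT INTO dbo.Orders_Errors (OrderId, OrderDate, Amount, ErrorFlag)
SELECT OrderId, OrderDate, Amount, 1
FROM dbo.Stage_Orders
WHERE ISNUMERIC(OrderId) = 0 OR ISDATE(OrderDate) = 0 OR ISNUMERIC(Amount) = 0;

One caveat: ISNUMERIC accepts some surprising strings (currency symbols, 'e' notation), so formats that must match the Oracle source exactly may need stricter pattern checks.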
I am having a table-locking issue that I need to start paying attention to, as it is getting more frequent.
The problem is that the data in the tables is live finance data that needs to be changed and viewed in near real time, so what I have picked up so far is that using table hints may not be a good idea.
I have a guy at work telling me that introducing a data access layer is the only way to solve this. I am not convinced, but I haven't enough knowledge to back my own feeling up (it's an ASP system, not .NET).
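One database-side option to evaluate before committing to a new data access layer (a sketch; it assumes SQL Server 2005 or later and needs testing, since the row version store adds tempdb load): row-versioned reads, which let readers see the last committed value instead of blocking on writers, without the dirty reads of NOLOCK hints.

-- run when no other connections are active in the database
ALTER DATABASE FinanceDb SET READ_COMMITTED_SNAPSHOT ON;  -- FinanceDb is a placeholder name

With this on, the existing queries keep their semantics (committed data only) but stop queuing behind the writers that are updating the live finance rows.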
Hi, I'm trying to upload the ASPNETDB.MDF file to a hosting server via FTP, and every time, when it is about halfway uploaded (40% or 50%), I get an error message saying "550 ASPNETDB.MDF: The process cannot access the file because it is being used by another process", and the upload fails. I'm using SQL Express. Does anybody know the cause? Thanks a lot.
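The cause is almost certainly that SQL Server Express keeps an exclusive handle on every attached .mdf, so the FTP client loses access partway through. One sketch of a workaround (assuming the database is attached to the local instance under the name aspnetdb; the re-attach path is a typical App_Data location and will differ on your machine): detach, upload, re-attach.

-- release SQL Server's handle on the file
EXEC sp_detach_db 'aspnetdb';

-- ... perform the FTP upload of ASPNETDB.MDF ...

-- re-attach the database afterward
EXEC sp_attach_db 'aspnetdb', 'C:\MySite\App_Data\ASPNETDB.MDF';

Stopping the SQL Server (SQLEXPRESS) service for the duration of the upload achieves the same thing with no T-SQL.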
Hi. When I try to start a package manually by clicking the Start Debugging button, I get this after a little while:
Cannot process request because the process (3880) has exited. (Microsoft.DataTransformationServices.VsIntegration)
How can I prevent this from happening? It happens every time I want to start the package, and each time the process ID is different; here it is 3880.
I have had a full lock on my SQL Server, and I have a few logs with which to find the origin of the lock.
I know the process at the head of the lock is process 55.
Here is the information I have on this process. The lock list shows two rows for spid 55:

Spid      55                                      55
ecid      5                                       5
Ecid      0                                       0
ObjId     0                                       1784601646
IndId     0                                       0
Type      DB                                      PAG
Resource                                          1:1976242
Mode      S                                       IS
Status    GRANT                                   GRANT
TransID   0                                       16980
TransUOW  00000000-0000-0000-0000-000000000000    00000000-0000-0000-0000-000000000000

And the session details:

lastwaittype  PAGEIOLATCH_SH
CMD           AWAITING COMMAND
Physical IO   1059
Login time    2007-07-05 04:29:53.873
net address   DFF06EBF974D
Wait type     0x0046
HostName      .
BlkBy         .
DBName        grpprddb
CPUTime       54331
DiskIO        1059
ProgramName   (blank)
Would someone know a way to identify the origin of process 55?
I have already tried executing the following query: select * from SYSOBJECTS where id = 1784601646
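A few standard probes against the spid itself (these are built in, not hypothetical) often say more than the object lookup:

-- the last statement submitted by spid 55
DBCC INPUTBUFFER (55);

-- login, host, program, and command for the session
EXEC sp_who2 '55';
SELECT * FROM master..sysprocesses WHERE spid = 55;

Also note that the sysobjects query only matches if it is run inside the database that owns object 1784601646 (grpprddb, per the session details above); run from another database it returns nothing.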
I have a File System Task that performs a copy-file operation in an SSIS package. When scheduled as a job, the package fails with the following error:
The process cannot access the file 'C:\ETL\Consignment\Apple\AppleRawFile.txt' because it is being used by another process.
However, when I right-click on the package and execute it manually from Integration Services, it runs successfully without any problem. I am not certain how to resolve this issue; any input will be much appreciated.
When running two File System Tasks one after the other on the same file, the file is still locked when the second task runs, resulting in: Error: 0xC002F304 at Rename file 1, File System Task: An error occurred with the following error message: "The process cannot access the file because it is being used by another process.".
I found a workaround by adding an Execute Process Task before the second File System Task that pings localhost. This introduces a 5-second delay, but there must be a better solution. Anyone?
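A lighter-weight delay, assuming the package has (or can be given) a connection to any SQL Server instance, is an Execute SQL Task between the two File System Tasks running nothing but:

WAITFOR DELAY '00:00:05';  -- pause the control flow for five seconds

That said, the delay only masks the issue; the more interesting question is what still holds the file after the first task completes. An antivirus or indexing service re-scanning the file is a common culprit.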
I'm going nuts with this SQL Server notification thing. I have gone through this article, which explains how to set up the user: http://www.codeproject.com/KB/database/SqlDependencyPermissions.aspx. The article shows how to create a new user and set it up for SQL Server notifications. But in my case the user already existed in the database, which is a very common scenario. So I did the following (see the SQL script below), but then I get this error: "A connection was successfully established with the server, but then an error occurred during the login process. (provider: Shared Memory Provider, error: 0 - No process is on the other end of the pipe.)" This is my SQL script:

use [master]
GO

-- Ensuring that Service Broker is enabled
ALTER DATABASE [DatabaseName] SET ENABLE_BROKER
GO

-- Switching to our database
use [DatabaseName]
GO

CREATE SCHEMA schemaname AUTHORIZATION username
GO

ALTER USER username WITH DEFAULT_SCHEMA = schemaname
GO

/*
 * Creating two new roles. We're not going to set the necessary permissions
 * on the user-accounts, but we're going to set them on these two new roles.
 * At the end of this script, we're simply going to make our two users
 * members of these roles.
 */
EXEC sp_addrole 'sql_dependency_subscriber'
EXEC sp_addrole 'sql_dependency_starter'

-- Permissions needed for [sql_dependency_starter]
GRANT CREATE PROCEDURE to [sql_dependency_starter]
GRANT CREATE QUEUE to [sql_dependency_starter]
GRANT CREATE SERVICE to [sql_dependency_starter]
GRANT REFERENCES on CONTRACT::[http://schemas.microsoft.com/SQL/Notifications/PostQueryNotification] to [sql_dependency_starter]
GRANT VIEW DEFINITION TO [sql_dependency_starter]

-- Permissions needed for [sql_dependency_subscriber]
GRANT SELECT to [sql_dependency_subscriber]
GRANT SUBSCRIBE QUERY NOTIFICATIONS TO [sql_dependency_subscriber]
GRANT RECEIVE ON QueryNotificationErrorsQueue TO [sql_dependency_subscriber]
GRANT REFERENCES on CONTRACT::[http://schemas.microsoft.com/SQL/Notifications/PostQueryNotification] to [sql_dependency_subscriber]

-- Making sure that my users are member of the correct role.
EXEC sp_addrolemember 'sql_dependency_starter', 'username'
EXEC sp_addrolemember 'sql_dependency_subscriber', 'username'
I am using SQL Server 2005 Express Edition. I have written a stored procedure that has a transaction, to make sure that it rolls back in case of failure. Is there a way to test that the transaction works? I don't know how to cause an error that would make the transaction roll back and raise the error with the details of the exception. Thanks, Laura
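One way to force the failure on demand is to raise an error yourself between the procedure's statements; on SQL Server 2005, RAISERROR at severity 16 inside a TRY block transfers control to CATCH. A minimal sketch of the pattern (the body is a placeholder for the procedure's real work):

BEGIN TRY
    BEGIN TRANSACTION;

    -- ... the procedure's real work ...

    RAISERROR ('Simulated failure for rollback testing', 16, 1);

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION;

    -- re-raise with the details of the exception
    DECLARE @msg nvarchar(2048);
    SELECT @msg = ERROR_MESSAGE();
    RAISERROR (@msg, 16, 1);
END CATCH

After confirming the data is unchanged, remove the RAISERROR line (or guard it behind a @TestMode parameter) and the procedure is back to normal.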
Hello everyone, I was hoping to get some help. I have SQL Server 2000 and IIS 5.1 running, and I'm using VBScript through ASP. I have a table with two keys that together make a composite unique key. I am trying to test my table in ASP so that if a duplicate of the unique key is inserted, an error is shown. Can anyone help me?
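One way is to test for the duplicate in T-SQL before inserting, so the page can show a friendly message instead of an ADO error. A sketch with placeholder table and column names (the two columns together being the composite key):

IF EXISTS (SELECT 1 FROM dbo.MyTable WHERE Key1 = @Key1 AND Key2 = @Key2)
    SELECT 1 AS IsDuplicate   -- the ASP page checks this flag
ELSE
BEGIN
    INSERT INTO dbo.MyTable (Key1, Key2, SomeValue)
    VALUES (@Key1, @Key2, @SomeValue)

    SELECT 0 AS IsDuplicate
END

Alternatively, attempt the INSERT and trap the key-violation error (number 2627 for a constraint, 2601 for a unique index) in ADO's Errors collection on the ASP side.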
I want to run remote procs from a local, central proc, but I don't want my proc to crash if the remote NT server or SQL Server is not available. So I want to test for NT and SQL availability first.
/* I find the server-dbasename in a lookup table, and then build a dynamic script */
SELECT @DBName = "<SRVRNAME~DBASENAME>"
SELECT @QueryString = N'master..xp_cmdshell "ECHO > \\' + SUBSTRING(@DBName, 1, 6) + '\pipe\sql\query", no_output'
EXEC @Results = sp_executesql @QueryString
SELECT @Results /* <-- testing here.... */
=====================
If the NT / SQL Server is running, @Results should = 0 (success on the ECHO to the pipe). If the NT / SQL Server is NOT running, @Results should = 1 (error on the ECHO to the pipe).
If I test in Query Analyzer, I get the expected @Results values.
If I run the proc, I get @Results = 0 in both cases, and then my proc fails on the remote proc call when the NT / SQL Server is not running.
I don't know how to write DMO in T-SQL to try that. I have also tried using odbcping.
Hints? Is there a better way to test whether the remote NT / SQL Server is running?
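One alternative sketch, assuming xp_cmdshell is allowed and the osql client is installed on the central server (SRVRNAME is a placeholder): let osql attempt a real login with a short timeout, and read the command's exit status, which exercises SQL Server itself rather than just the named pipe.

DECLARE @rc int
-- -E uses a trusted connection; -l 5 caps the login wait at five seconds
EXEC @rc = master..xp_cmdshell 'osql -S SRVRNAME -E -l 5 -Q "SELECT 1"', no_output
IF @rc = 0
    PRINT 'Remote SQL is up'          -- safe to call the remote proc
ELSE
    PRINT 'Remote SQL is unavailable' -- skip the remote call

As I recall, xp_cmdshell returns 0 when the spawned command succeeds and 1 when it fails, which is what makes the osql exit status usable here.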
I have performed all the steps for replication. There are a few problems I am encountering:
(i) How do I check whether the replication is working or not?
(ii) Can replication on the subscribing server automatically create the table that the publishing server is publishing?
(iii) Please point me to some other forums for SQL Server.
My understanding is that with replication, if I modify the data in one place it should automatically be modified on the subscribing server, but this is not happening (and it is not flashing any error message as such). Please tell me everything I need to do to make sure it all starts working.
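A quick end-to-end smoke test for (i) (a sketch; the table is a placeholder, and this assumes transactional replication with the agents running):

-- at the publisher
INSERT INTO dbo.ReplTest (Id, Note) VALUES (999, 'replication smoke test');

-- a minute or so later, at the subscriber
SELECT * FROM dbo.ReplTest WHERE Id = 999;  -- row present = changes are flowing

If the row never arrives, the agent history in Replication Monitor usually names the failing step. As for (ii): with default article options, it is the initial snapshot that creates the published tables at the subscriber, so a snapshot that was never generated or applied would explain both the missing tables and the missing data.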
I'm new to this forum, as well as to OLAP testing. I would like to learn about OLAP testing and get guidance on SQL Server Analysis Services.
I have been trying to install SQL Server 2000 on my system, but it is not going through. Can anybody send information on how to install SQL Server and how to play with the sample databases?
Has anyone ever heard of using Response.Write to test SQL queries in the query designer? Here is the actual suggestion I was given recently:
Response.Write your SQL query. Copy the query to your db query tool and run the query. Adjust the query as necessary to return the recordset you desire. When you have the proper query, adjust your ASP code.
The query editor/designer doesn't like certain symbols and extraneous code. So how exactly can you use Response.Write with a SQL query without getting errors in the designer?
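The trick is that only the finished SQL string, not the surrounding ASP code, ever goes into the query designer. A minimal sketch in classic ASP (sSQL is a placeholder for whatever variable holds the query your code builds):

' print the fully built query string and halt the page,
' so the text can be copied into the query tool as plain SQL
Response.Write Server.HTMLEncode(sSQL)
Response.End

What gets pasted into the designer is then pure SQL; the quotes, ampersands, and variable names that upset the designer never leave the ASP source.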
I've begun some testing with the June beta of SQL 2005. One problem I've hit is with scalar-valued functions. The error I often get when executing functions is "Select statements included within a function cannot return data to a client." These same functions are working fine under SQL 2000. Has anybody seen this behavior or know what the deal is?
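For reference, that error is raised when a SELECT inside the function body does not assign its result to anything; in a scalar UDF every SELECT has to populate a variable or feed the RETURN. A minimal sketch of the allowed shape (illustrative function):

CREATE FUNCTION dbo.ufn_Double (@x int)
RETURNS int
AS
BEGIN
    DECLARE @result int;
    SELECT @result = @x * 2;   -- assigning SELECT: allowed
    -- SELECT @x * 2;          -- bare SELECT: raises the error quoted above
    RETURN @result;
END

Also worth checking is how the function is invoked: scalar functions are called as SELECT dbo.ufn_Double(21), schema-qualified, rather than like a stored procedure.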
I've started researching Unit Testing, and I must admit I had never heard of Unit Testing until a couple of months ago. Obviously I am interested in unit testing stored procedures. I read the TSQLUnit documentation (not all of it), and I also ran into a newsgroup post saying TSQLUnit is very small compared to NUnit. The conclusion I am drawing from this post is that I should rather spend time researching/reading about NUnit than TSQLUnit. Is that a good assessment? I would like to know what you use, and whether you actually use Unit Testing or some other method. I ran into White Box/Black Box QA testing. All of this is new to me. Any good place to read about "Extreme Programming"? I ran into one link that I saved at work; that's one place I will read more. Any links, documentation, or books you would suggest? I searched Amazon and didn't find anything interesting regarding SQL Server and stored procedures. Thank you.
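For a feel of what a stored-procedure unit test looks like before picking a framework, here is a minimal hand-rolled sketch (hypothetical procedure and table; the transaction makes the test repeatable):

BEGIN TRANSACTION;

-- arrange: create a known input state
INSERT INTO dbo.Customers (CustomerId, Name, IsActive) VALUES (1, 'Test', 1);

-- act: run the procedure under test
EXEC dbo.usp_DeactivateCustomer @CustomerId = 1;

-- assert: verify the expected effect
IF EXISTS (SELECT 1 FROM dbo.Customers WHERE CustomerId = 1 AND IsActive = 0)
    PRINT 'PASS';
ELSE
    PRINT 'FAIL: customer was not deactivated';

-- clean up so the test can run again
ROLLBACK TRANSACTION;

TSQLUnit automates this arrange/act/assert/rollback cycle in T-SQL itself, while NUnit drives it from .NET test code, so the choice mostly depends on whether the tests should live in the database or in an application test suite.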