My English is not the best, but I will try my best.
I want to change the folder where the database is saved from my system partition to my data partition (from C:\labla to D:\mydata), but I don't know how. I can see the folder, but it is shown in gray and cannot be changed in the server configuration tool.
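If it helps, moving an existing database's files is usually done in T-SQL rather than in the configuration tool. A minimal sketch, assuming SQL Server 2005 or later and made-up database and logical file names (check the real ones with SELECT name, physical_name FROM sys.master_files WHERE database_id = DB_ID('MyDatabase')):

-- Point the catalog at the new location
ALTER DATABASE MyDatabase MODIFY FILE (NAME = MyDatabase_Data, FILENAME = 'D:\mydata\MyDatabase.mdf');
ALTER DATABASE MyDatabase MODIFY FILE (NAME = MyDatabase_Log, FILENAME = 'D:\mydata\MyDatabase_log.ldf');

-- Take the database offline, copy the .mdf/.ldf files to D:\mydata yourself, then bring it back online
ALTER DATABASE MyDatabase SET OFFLINE;
ALTER DATABASE MyDatabase SET ONLINE;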
Can we change the name of the "Home" folder to something else? We have 15-20 instances of SSRS, and it would be really useful if I could display an instance name instead of "Home".
I don't know a lot about SQL Server 2005 (Express Edition). For debugging purposes I want to copy the whole App_Data folder (.mdf and .log files) on the production server to another folder on the same machine (or sometimes to a network folder). When I copy this App_Data folder and try to paste it to a new location, I get the error message "Cannot copy ASPNETDB: It is being used by another person or program. Close any programs that might be using the file and try again." After reading that message, I close Visual Web Developer, stop the website in IIS, and stop the SQLExpress service on the server, then try again, but I still get the same message. How can I make sure that all programs accessing these database files are closed, so that I am able to copy them to a different location?
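One hedged option, assuming you can connect to the SQLExpress instance with Management Studio Express or sqlcmd (the database name and file paths below are assumptions, and with user instances the files may actually be attached under a separately spawned instance): detach the database so the files are released, copy them, then reattach.

-- Detach so the .mdf/.ldf files are no longer held open
USE master;
EXEC sp_detach_db @dbname = N'ASPNETDB';
-- copy the App_Data files, then reattach:
EXEC sp_attach_db @dbname = N'ASPNETDB',
    @filename1 = N'C:\inetpub\wwwroot\MySite\App_Data\ASPNETDB.MDF',
    @filename2 = N'C:\inetpub\wwwroot\MySite\App_Data\ASPNETDB_log.LDF';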
I know a WMI event watcher can be used to watch for a new file being added to a folder. However, I need to check for new folders being added to an existing folder. I haven't been able to find a post on doing this. Is there a way in WQL to check for a new folder being added instead of a new file? I've used SQL for years, but am new to SSIS.
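As a hedged sketch only (the folder path and polling interval are made up, and the doubled backslashes are WQL string escaping): the WMI Event Watcher Task can be given a WQL query that fires when a new directory appears inside an existing one, along these lines:

SELECT * FROM __InstanceCreationEvent WITHIN 30
WHERE TargetInstance ISA 'Win32_Directory'
AND TargetInstance.Drive = 'C:'
AND TargetInstance.Path = '\\MainFolder\\'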
I have a SharePoint (2007) Document Library site called B. Web Client is enabled on the server and B is mapped as a Drive (let's call it Y for this discussion)
I want to move documents in A to B. Easy enough, right? Not so....
I first started by creating a batch file that issues a COPY \A \Y /Y at the command prompt. Voila! It worked great!
I then moved that command into a SQL Agent job as a CmdExec step (the exact same statement) and attempted to run it... CRASH! It found the files in A but then said "The system cannot find the path specified".
Ok, so I tried it in SSIS. CRASH! Checked the error log. Same thing...
So I then checked the account under which the SQL Agent was running (a special domain account for all our SQL Servers). Thinking it might matter, I changed it to run under my account (I'm a domain admin). I also ensured I had permissions to the SPS 2007 library as well (I did).
Ran again! CRASH! Same error....
So I created a batch file, placed the command in it, and ran that from the command prompt. Voila! It worked great.
So, thinking how ingenious I was, I pasted my C:\RootCopy.bat into my SQL Agent job. With a big grin on my face I right-clicked and picked "Start Job at Step"... CRASH! Same error.
Does anyone have any ideas on this?
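In case it is relevant: a mapped drive letter like Y: exists only in the user session that mapped it, so a service such as SQL Server Agent generally cannot see it. A hedged sketch of the same copy addressed by UNC path instead (server and library names are made up, and the exact WebDAV/UNC form for a SharePoint library can vary); xp_cmdshell is used here only to keep the example in T-SQL, but the same UNC paths would go into the CmdExec step or batch file:

EXEC master..xp_cmdshell 'copy "\\fileserver\A\*.*" "\\sharepointserver\SiteB\DocLibB" /Y';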
I have a set of folders with the following structure:
MainFolder
    1999
        SpreadsheetA.xls
        SpreadsheetB.xls
        SpreadsheetC.xls
    2000
        SpreadsheetA.xls
        SpreadsheetB.xls
        SpreadsheetC.xls
    etc.
Is there a way that I can use the Foreach Loop container to loop over the subfolders? My plan was to get the folder name and path into a variable, use this to build the connection string for each file in that folder, carry out the upload for that folder, and then move on to the next folder and do the same thing, but I cannot see a way to do this.
I am having a problem moving a file from one folder to another. Here is the detailed scenario:
I want to move an input.csv file from a shared input folder to a shared archive folder. I am using the code below to do this:

declare @inpath varchar(100), @filestatus int
set @inpath = 'move "\\abcdef\INPUT\input*.csv" "\\abcdef\ARCHIVE\archive.csv"'
exec @filestatus = master..xp_cmdshell @inpath

The problem is that it cuts the input.csv file from the INPUT folder but does not paste it into the ARCHIVE folder. I would really appreciate it if anyone could help me solve this or suggest some workarounds.
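A hedged guess and sketch (paths as reconstructed above): moving a wildcard match onto a single file name such as archive.csv means the file ends up renamed to archive.csv rather than keeping its own name, which can look like it never arrived. Moving into the folder itself, and renaming in a separate step if needed, may behave more predictably; it is also worth confirming the SQL Server service account can write to the ARCHIVE share.

declare @inpath varchar(200), @filestatus int
set @inpath = 'move /Y "\\abcdef\INPUT\input*.csv" "\\abcdef\ARCHIVE\"'
exec @filestatus = master..xp_cmdshell @inpath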
Any suggestions on having one web.config that, once put on the dev servers, uses the dev SQL Server and, when put on prod, uses the production SQL Server? I would like to encrypt it and be done with it, but it needs to recognize the server it is on. I have a connection class that does this, but I need to use SqlDataSource and not ObjectDataSource.
For the first time in my long SQL DBA life I am seeing behaviour like this. My tempdb database has been growing every damn second since this morning. It has now reached 30 GB; the log file is nearly empty (217 MB).
We use SQL 2000 Enterprise on Windows 2000 Advanced Server, running Siebel Call Center (version 7.5) with about 300 users.
From time to time some users obtain and hold a huge number of exclusive locks on tempdb extents.
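For what it is worth, a hedged sketch of the kind of query I would run on SQL 2000 to see which sessions hold locks in tempdb and what they are doing (system tables as documented for SQL 2000; the spid in the DBCC call is just an example):

-- Sessions holding locks in tempdb
SELECT p.spid, p.loginame, p.program_name, p.open_tran, p.waittype, COUNT(*) AS tempdb_locks
FROM master..sysprocesses p
JOIN master..syslockinfo l ON l.req_spid = p.spid
WHERE l.rsc_dbid = DB_ID('tempdb')
GROUP BY p.spid, p.loginame, p.program_name, p.open_tran, p.waittype
ORDER BY tempdb_locks DESC

-- Then look at what the worst offender is currently running
DBCC INPUTBUFFER(53)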
I've got two SQL Server 2000 (SP ??) instances (on two separate machines; Win Server 2003 Standard) that I've inherited. I want to use one of them as a reporting instance of production for a single ~4GB database, updated nightly.
In other DBMSs I'd set up log shipping or a simple dump-and-load to keep the two in sync, but I'm not very familiar with SQL 2000 (I used to admin a SQL Server 7 back in the day, but have been on Sybase ASE, MySQL (blech) and 'Orable since).
Any suggestions to do this easily and (fairly) painlessly?
Would I want to set up replication between the two? If so, which flavor? -- To me, this seems a bit overkill. Plus I hate to muck with production unless I really need to
Would I use DTS to do this? -- Seems straightforward but as I understand it, DTS under-the-covers is a bcp-type process, which can be fairly slow.
Or a simple dump-and-load (with copy)? -- This seems the best option as we're already doing a nightly dump. However, the data will have to be shuffled off to the other server (or some sort of network share set up that it can access) and then a script fired off when the dump is complete. This seems the most "brittle" of the three options (if the dump hasn't finished yet, then the script copy and import will fail, etc.)
Surely this has been done over and over again (searching the archives didn't tell me anything, but the site search tool isn't that great).
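On the dump-and-load option, a minimal sketch of the nightly refresh, assuming the .bak file gets copied somewhere the reporting server can read and that the database and logical file names below are placeholders (RESTORE FILELISTONLY against the backup shows the real logical names):

-- On production (probably already part of the nightly dump)
BACKUP DATABASE ProdDB TO DISK = 'D:\Backups\ProdDB.bak' WITH INIT

-- On the reporting server, after the file has been copied across
RESTORE DATABASE ProdDB
FROM DISK = 'D:\Restores\ProdDB.bak'
WITH REPLACE,
    MOVE 'ProdDB_Data' TO 'D:\Data\ProdDB.mdf',
    MOVE 'ProdDB_Log' TO 'D:\Data\ProdDB_log.ldf'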
I have recently become a release manager for SSRS in our company. Since then I've been swamped with requests to migrate reports, permissions, and subscription lists from the development environment to production.
Each time I have to do it manually with a lot of clicks. It is a real pain...
So, maybe... maybe there is an automation tool out there to help me? Does anybody know of one?
The tool or software package should move a report file along with its permissions and subscription lists from one server to another.
I have been using the Index Tuning Wizard to review some of my stored procs and views. So far most of my indexes have been set up well, but I am curious as to how they would look under a production system load. I was thinking of running a profile for about 30 minutes or so on the prod system, and then using that profile with the Index Tuning Wizard to see what it says. Would this be of value? Can running a profile on a prod system be dangerous?
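A hedged sketch of capturing that workload with a server-side trace rather than the Profiler GUI, which is generally lighter on a production box; the output file can then be fed to the Index Tuning Wizard. The file path is a placeholder, and the event/column IDs (12 = SQL:BatchCompleted, 10 = RPC:Completed, column 1 = TextData, 13 = Duration) are the documented ones but worth double-checking in Books Online:

DECLARE @TraceID int, @maxfilesize bigint, @on bit
SET @maxfilesize = 50
SET @on = 1

-- Create the trace, writing to a file capped at 50 MB (.trc is appended automatically)
EXEC sp_trace_create @TraceID OUTPUT, 0, N'D:\Traces\prod_workload', @maxfilesize, NULL

-- Capture completed batches and RPCs with their text and duration
EXEC sp_trace_setevent @TraceID, 12, 1, @on
EXEC sp_trace_setevent @TraceID, 12, 13, @on
EXEC sp_trace_setevent @TraceID, 10, 1, @on
EXEC sp_trace_setevent @TraceID, 10, 13, @on

-- Start it, note the trace id, wait ~30 minutes, then stop and close it
EXEC sp_trace_setstatus @TraceID, 1
SELECT @TraceID AS TraceID
-- EXEC sp_trace_setstatus <traceid>, 0  -- stop
-- EXEC sp_trace_setstatus <traceid>, 2  -- close and delete the trace definition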
I have a package that uses configurations to override package settings based on the environment the package runs in. The package's configuration entries begin with an initial XML config entry that overrides the package's connection manager to point at a SQL database that holds the remaining configs in a table. Subsequent config entries then fetch their settings from that table. This package is run from a SQL job.
This all works fine in dev. When I moved everything into prod, the packages are not getting configured and are using the values stored in the DTSX files. I've triple-checked the XML config file, the tables with the configs, and the packages. There are no error messages. I've added some debugging steps to the package to verify that the configs in the table are not getting into the package.
I've also tried manually changing the configs in the table where the package is set to look, in case the initial XML config was failing to redirect the config database location. The package still fails to see any configs from the table.
What could be different between dev and prod that would produce this situation? Both dev and prod have identical copies of the package and the job and are currently pointed to the same configurations database.
By the way, the other connections in the package work for both source selects and destination inserts. Only the configurations are failing, and again there is no error message.
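For what it's worth, a hedged sanity check against the prod configuration database (the filter name is whatever you chose when creating the SQL Server configuration; the table and column names are the ones SSIS creates by default):

SELECT ConfigurationFilter, PackagePath, ConfiguredValue, ConfiguredValueType
FROM dbo.[SSIS Configurations]
WHERE ConfigurationFilter = 'MyPackageFilter'
ORDER BY PackagePath

If the PackagePath values do not exactly match the object names inside the prod copy of the package, the rows may be silently ignored, which might match the no-error behaviour you describe.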
We have two different SQL Server 2005 databases, one for development and one for prod. I'm pretty new to DTS, and even newer to SSIS, and am working on converting a bunch of DTS packages to SSIS.
Our DTS packages were essentially duplicated and edited for production because of the different server names, and some different directory names for sources and destinations. So the dev package would connect to the DBDEV database, and the prod package to DBPROD, for example. When I create a package in dev and then copy it to prod, I have to go in and change all of the connections to point to prod. There are also global variables pointing to various directories that need to be modified. Worse yet, there are also variables for directories set in ActiveX Script Tasks (which are deprecated and so need to be replaced). This is kludgy and error-prone. So, since I'm learning SSIS and converting the packages, I would like to make them better.
What is a good way to specify the connections in a dynamic way? That way, when the packages are moved from dev to prod, the prod database can just be specified in one place. In other words, what are some best practices that I should know about? I'm reading this forum, and checking out links I find therein, but I also do better with specific examples (because I am so new to this).
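One common pattern, offered only as a hedged sketch (the table layout is the one SSIS generates for a SQL Server package configuration; the filter, server, and path values are illustrative): keep a config database with the same name on dev and prod, point every package at it through a single environment variable or XML file, and let the rows differ per environment, so moving a package never requires editing it.

CREATE TABLE dbo.[SSIS Configurations] (
    ConfigurationFilter nvarchar(255) NOT NULL,
    ConfiguredValue nvarchar(255) NULL,
    PackagePath nvarchar(255) NOT NULL,
    ConfiguredValueType nvarchar(20) NOT NULL
)

-- On prod this row carries the prod connection string; on dev, the dev one
INSERT INTO dbo.[SSIS Configurations]
VALUES (N'CommonConnections',
        N'Data Source=DBPROD;Initial Catalog=MyDb;Integrated Security=SSPI;',
        N'\Package.Connections[MyDb].Properties[ConnectionString]',
        N'String')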
Thank you for any guidance you can provide. -thursday's geek
For critical systems running SQL 2000, I've always believed the development, QA, and production instances should all be the same edition of SQL Server. I didn't want to take a chance on something performing differently in development than in QA and production because dev was Standard Edition while QA and production were Enterprise Edition.
For SQL Server 2005, would you agree with this approach? I'm only referring to critical systems; for non-critical ones I am willing to take the chance.
So, we are about 3 weeks away from going into production, and somehow we failed to give much thought to deploying our RS project into production.
We have over 110 report models that need to be deployed into production, and until now we have just deployed to our dev and test environments using Visual Studio. But in our production environment, our deployers will not have Visual Studio.
Is there any simple backup/restore method that can be used to move our test environment into production? Please don't suggest copying each file one at a time /sigh.
I am migrating from local to Dev, QA, and Prod. I created a .dtsConfig file containing database connection strings for the Dev database. What is the "location" on the Dev server where this .dtsConfig file needs to be deployed?
We're trying to run an EXE from SSIS through "Execute Process" task.
The EXE folder contains other DLLs as well.
The EXE interacts with the database and Reporting Services and sends some e-mails (max 500 a day) out to customers.
My question is: Is it ok to run this kind of EXE on the production SQL box? If not, why? (People argue that running EXEs is not advisable on production boxes)
Q: Why did Microsoft introduce "Execute Process" task when we cannot run EXEs on the production box?
Could somebody explain whether it's OK to run such EXEs on production SQL Servers?
In either case, some explanation is greatly appreciated.
All queries on our production database are timing out. Viewing the error log, the following messages show up over and over again:
Autogrow of file 'tempdev' in database 'tempdb' was cancelled by user or timed out after 3937 milliseconds. Use ALTER DATABASE to set a smaller FILEGROWTH value for this file or to explicitly set a new file size.
Autogrow of file 'Prod' in database 'Production' was cancelled by user or timed out after 33156 milliseconds. Use ALTER DATABASE to set a smaller FILEGROWTH value for this file or to explicitly set a new file size.
Our production database is about 1 GB in size with 3.5 million records. I tried changing the autogrow setting from 30% (which it was before) to 100 MB, but no luck; it is still timing out and producing the errors above. Permissions should all be good; nothing has changed.
There is about 50 GB of available disk space as well, so that's not the problem. Thanks for the help.
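For reference, a hedged sketch of the statements involved; the logical file names Prod and tempdev come from the error messages above, but the sizes are placeholders. Pre-growing the files to roughly their expected size means autogrow rarely has to fire in the middle of a query at all:

-- Fixed-size growth instead of percentage growth
ALTER DATABASE Production MODIFY FILE (NAME = Prod, FILEGROWTH = 100MB)
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILEGROWTH = 100MB)

-- Optionally pre-size the files up front
ALTER DATABASE Production MODIFY FILE (NAME = Prod, SIZE = 2048MB)
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, SIZE = 1024MB)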
I'm having an issue restoring a database from prod to the report server. I'm getting the following errors.
When I did it manually, I got the first error below.
Msg 233, Level 20, State 0, Line 0 A transport-level error has occurred when sending the request to the server. (provider: Shared Memory Provider, error: 0 - No process is on the other end of the pipe.)
This is the second error
Msg 3044, Level 16, State 1, Line 37
Invalid zero-length device name. Reissue the statement with a valid device name.
Msg 3013, Level 16, State 1, Line 37
RESTORE DATABASE is terminating abnormally.
Msg 5011, Level 14, State 5, Line 45
User does not have permission to alter database 'XeP', the database does not exist, or the database is not in a state that allows access checks.
Msg 5069, Level 16, State 1, Line 45
ALTER DATABASE statement failed.
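A hedged observation: Msg 3044 usually means the backup device or file name the RESTORE ended up with was an empty string, for example a variable that never got populated. A minimal sketch of the restore with the device named explicitly (the share, paths, and logical file names are placeholders; RESTORE FILELISTONLY shows the real logical names):

RESTORE FILELISTONLY FROM DISK = N'\\prodserver\backups\XeP.bak'

RESTORE DATABASE XeP
FROM DISK = N'\\prodserver\backups\XeP.bak'
WITH REPLACE,
    MOVE N'XeP_Data' TO N'D:\Data\XeP.mdf',
    MOVE N'XeP_Log' TO N'D:\Data\XeP_log.ldf'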
I created a Prod order status report. In the status field we have these different statuses:
Created = 0
Start = 4
Released = 3
Reported as finished = 5
Ended = 7
In the report I don't want to show Prod orders with the Ended status. How can I add a filter so that it shows all the other statuses but not Ended? When I tried a filter of <7, it did not work.
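A hedged sketch of the simplest form of that filter at the query level, assuming the field holds the numeric code and using made-up table and column names; if the field actually holds the status text, compare against the text instead. In an SSRS dataset or tablix filter, the equivalent is the status field with the <> operator and the value 7.

SELECT *
FROM ProdOrders
WHERE ProdStatus <> 7   -- everything except Ended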
Hi all, I need one more bit of help! We can select the dtsConfig file with an environment variable (indirect configuration), but I need to select the configuration file at runtime: I have to load the package from the server, apply a local configuration file to the package, and run it on a web server.
Requirements:
1. I have packages with XML configurations for the connection strings alone.
2. I deployed them to the server.
3. I am trying to execute the packages in a web page's OnClick event.
We have an ASP.NET web page; in the OnClick event I have this code:
Application ap = new Application();
Package pk = ap.LoadFromSqlServer("\\PROJECT", "itsssqldb", "pmo_package_user", "password", null);
pk.ImportConfigurationFile(@"Packages\dev_staging.dtsConfig");
pk.Execute();

Here PROJECT is the package name. I'm loading the package from the common SQL Server 2005 instance, applying the configuration file to that package, and trying to execute it, but it's not using the config file I specified; it returns failure!
Can you please help me resolve this issue?
(The intention is to select the configuration file (prod or dev) at runtime.)
I just started my job as a SQL Server DBA. We have a disaster recovery plan and have implemented log shipping. My question: after a server failure, once the production server is up again, how do I point the standby server back to the production server? Any help will be appreciated.
I have created the following function but keep getting this error:
Only functions and extended stored procedures can be executed from within a function.
Why am I getting this error?
Create Function myDateAdd (@buildd nvarchar(4), @avdate as nvarchar(25))
Returns nvarchar(25)
as
Begin
    declare @ret nvarchar(25)
    declare @sqlval as nvarchar(3000)

    set @sqlval = 'select ''@ret'' = max(realday) from (
        select top ' + @buildd + ' realday
        from v_caltable
        where realday >= ''' + @avdate + ''' and prod = 1 ) a'
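The error itself comes from trying to EXEC dynamic SQL inside a user-defined function, which SQL Server does not allow. A hedged sketch of the same logic without dynamic SQL, assuming SQL Server 2005 or later (where TOP accepts a variable) and that realday can be treated as a datetime; the ORDER BY is added so the TOP is deterministic:

CREATE FUNCTION dbo.myDateAdd (@buildd int, @avdate datetime)
RETURNS datetime
AS
BEGIN
    DECLARE @ret datetime

    SELECT @ret = MAX(realday)
    FROM (SELECT TOP (@buildd) realday
          FROM v_caltable
          WHERE realday >= @avdate AND prod = 1
          ORDER BY realday) AS a

    RETURN @ret
END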
We ran into a weird/interesting issue; details below.
Version: Microsoft SQL Server 2012 (SP1) - 11.0.3000.0 (X64) Standard Edition (64-bit) on Windows NT 6.2 <X64> (Build 9200):
We are using SQLCMD to run a DDL script on our product database in the order below. The script has the following content:
Step 1 - A database collation change (to case-sensitive) as the very first statement of the script
Step 2 - The actual DDL statements
Step 3 - A database collation change back to the original (case-insensitive)
When we execute all three steps above in a single script using SQLCMD on test_server#1, it succeeds, but when the same thing is run on test_server#2 it fails. We ensured that no other user is accessing the database and that the settings on both servers are all default/basic. Separating the three steps into three different scripts works fine; the problem only occurs when we combine them into a single script and fire it with SQLCMD. If it were something related to the session or transaction, we should hit the same issue on test_server#1 as well, but that is not the case. test_server#1 and test_server#2 have exactly the same database and data; they are just two different physical machines and SQL Server instances.
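One hedged guess worth ruling out: whether the collation change and the DDL end up compiled in the same batch. Putting GO between the steps makes each one its own batch, so the DDL is parsed only after the collation change has actually taken effect (database and collation names below are placeholders):

ALTER DATABASE MyProductDb COLLATE Latin1_General_CS_AS;
GO
-- Step 2: the actual DDL statements go here
GO
ALTER DATABASE MyProductDb COLLATE SQL_Latin1_General_CP1_CI_AS;
GO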
I need to restore the test DB from a production backup, but once it is restored I need all the permissions of the SQL logins and Windows AD accounts in the test DB to be intact, exactly as they were before.
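A hedged sketch of one common approach, run in the test database before the restore: generate the permission and role-membership statements as text, save the output, and replay it after the restore (assumes SQL Server 2005 or later; extend it for object-level permissions if you use them). Orphaned SQL-login users can then be re-linked.

-- Database-level permissions currently granted in the test DB
SELECT pe.state_desc + N' ' + pe.permission_name + N' TO ' + QUOTENAME(pr.name) + N';'
FROM sys.database_permissions pe
JOIN sys.database_principals pr ON pr.principal_id = pe.grantee_principal_id
WHERE pe.class = 0   -- database-level only

-- Role memberships
SELECT N'EXEC sp_addrolemember N''' + r.name + N''', N''' + m.name + N''';'
FROM sys.database_role_members rm
JOIN sys.database_principals r ON r.principal_id = rm.role_principal_id
JOIN sys.database_principals m ON m.principal_id = rm.member_principal_id

-- After the restore, re-link any SQL logins whose database users became orphaned
-- EXEC sp_change_users_login 'Auto_Fix', 'SomeSqlUser'   -- user name is an example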