SQL 7 to 2000... How to Transfer User IDs, Jobs, and DTS Packages
Feb 11, 2003
Hello,
I want to move a DB from SQL Server 7 to SQL Server 2000. I'll use backup and restore to move the database, but how do I handle the objects mentioned in the header?
Both servers (old and new) will have the same name, as will the DB.
We've upgraded our SQL 7.0 user DB to a SQL 2000 system using the sp_detach_db and sp_attach_db stored procedures, and we'd like to transfer our scheduled jobs from the SQL 7.0 system to the SQL 2000 system. If the jobs were created outside a maintenance plan, we can script them out and run the script to recreate them on the 2000 system. But if the jobs belong to a maintenance plan, we would have to recreate those plans manually on the 2000 system. Is this the only option we have? Does anyone have a better solution? I think restoring msdb from a backup should work better in this case.
In SQL 2000 there is a new DTS task for transferring jobs between 2000 systems (same version), but I don't think it works for transferring them from 7.0 to 2000?? Anthony
I want to grant a user permission to run jobs in SQL Server 2000. I can't seem to find any appropriate roles for that. Do I need to assign him a server role? In that case, which one? Many thanks.
Is there any way that I can script out all the jobs at once (instead of doing them one by one)? I have more than 100 jobs to transfer and was just wondering if there were a way to do them all at once.
I have different jobs scheduled in the system but I can't find which package is launched by each job. I only have this information: (double-click the job, Steps tab, Modify button for any job step, in the command text window).
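For anyone else trying to work out which DTS package a job step launches, a query against the msdb job tables can save a lot of clicking in Enterprise Manager. A minimal sketch (note that packages scheduled directly from EM often show up as an encrypted DTSRun /~Z... string rather than a readable package name):

-- List every job step whose command calls DTSRun, i.e. launches a DTS package
SELECT j.name AS job_name,
       s.step_name,
       s.command           -- the DTSRun command line
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobsteps AS s ON s.job_id = j.job_id
WHERE s.command LIKE '%dtsrun%'
ORDER BY j.name, s.step_id;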
Today when I came into work, I noticed all of my jobs had failed on a particular server. I tried to manually kick off the packages, but to no avail. Both the job history and the error message from the packages report a timeout error. Any idea on this? Also, I have no problem running a query from Query Analyzer...even on a remote machine. Any help would be appreciated. Trevor
Hi boys and girls, I am a total newbie to ms sql. So please be gentle.
I have an MS SQL database on one server, but I need a duplicate of it on another server, with all packages and jobs duplicated on the new server.
I was able to export the database over to the new database on the new server; no problems there.
The problem is that under DTS I have "local packages" that all need to be moved to the new server. I am able to export them to my computer, but on the new server there is no import for the local packages. How do I get these packages onto the new server's MS SQL???
Also, under Management and SQL Server Agent, there are jobs on the old DB that I need to move to the new server in MS SQL.
I am trying to transfer all jobs from one instance to another using data tools. However, when I try to make the SMO connection I get this error:
A network-related or instance-specific error occurred while establishing a connection to SQL Server.
In order to solve this issue I have tried these solutions:
1. SQL Server should be up and running. (OK)
2. Enable TCP/IP in SQL Server Configuration Manager. (OK)
3. Open the port in Windows Firewall. (FW ACCEPTS ALL LOCAL PORTS)
4. Enable remote connections. (CHECKED THE sp_configure SETTINGS; I even right-clicked the instance and looked at its properties)
5. Enable the SQL Server Browser service. (SQL Server Browser has been restarted)
6. Create an exception for sqlbrowser.exe in the firewall. (FW ACCEPTS ALL PROGRAMS)
7. I tried Windows and SQL authentication with a login that has the sysadmin role.
8. The INSTANCE name has also been checked millions of times.
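One more check that sometimes helps with this error, assuming you can still connect locally on the server itself: confirm which protocol and TCP port the instance is actually listening on and compare that with what the remote tool is using. A small diagnostic sketch (SQL 2005 or later):

-- Run from a local connection; shows how the current session reached the server
SELECT @@SERVERNAME AS server_name,
       SERVERPROPERTY('InstanceName') AS instance_name,
       c.net_transport,      -- Shared memory / TCP / Named pipes
       c.local_net_address,
       c.local_tcp_port      -- the port a remote SMO connection must be able to reach
FROM sys.dm_exec_connections AS c
WHERE c.session_id = @@SPID;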
I'm having an issue in SQL Server 2005 with jobs that execute SSIS packages. The jobs run fine for a week or so, then I'll come to find that four or five (of the ten or so jobs) are hung in "executing" status. They seem to hang indefinitely (some have been "executing" for hours with no end in sight). The schedules of the hung jobs are all different, varying from every 10 minutes to nightly. The packages perform completely different tasks, as well. I can't seem to find any common thread among the jobs that get hung, other than they are all executing SSIS packages.
I've tried manually stopping the jobs and restarting the Agent and SQL Server, but the jobs hang again on their next scheduled run. The only thing that fixes the issue is rebooting the box, and then the jobs hang again in a week or so. Could some sort of memory leak be consuming resources throughout the week and be causing the jobs to eventually hang? I just rebooted the box and the sqlagent90.exe process is currently using about 7 MB of memory. I'll keep an eye on it. Any other suggestions?
I've thought of creating another job that stops jobs that are hung, but what's to say that this job won't get hung as well? Plus this seems like a band-aid fix...
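In case it helps, here is a rough sketch of that watchdog idea against the SQL 2005 Agent tables. The two-hour threshold and the job name passed to sp_stop_job are only placeholders, and sysjobactivity also keeps rows from earlier Agent sessions, so a real version should additionally filter on the most recent session_id:

-- Jobs that started more than 2 hours ago and have not recorded a stop time
SELECT j.name, a.start_execution_date
FROM msdb.dbo.sysjobactivity AS a
JOIN msdb.dbo.sysjobs AS j ON j.job_id = a.job_id
WHERE a.stop_execution_date IS NULL
  AND a.start_execution_date < DATEADD(hour, -2, GETDATE());

-- Stop one of them by name (placeholder job name)
EXEC msdb.dbo.sp_stop_job @job_name = N'MyHungSsisJob';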
I don't recall having these problems until installing SQL Server 2005 SP2. Could this be related? I've searched like crazy and still can't find a resolution to this. It's becoming a big PITA...
Anyway, any suggestions would be very much appreciated!
We will migrate our SQL 7.0 databases to SQL 2000 on another server. Since I have hundreds of DTS packages in my Repository, is there a better way of transferring them to the other server's Repository than ... Package -> Save As -> choose server and location, etc., etc.? Thanks, Suat.
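For packages saved as Local Packages (stored in msdb rather than in the Meta Data Services repository), one bulk-copy approach is to move the rows of sysdtspackages directly over a linked server. This is only a sketch; NEWSERVER is a placeholder linked server name, and it does not cover packages that actually live in the repository tables:

-- Run on the old server, with a linked server pointing at the new one
INSERT INTO NEWSERVER.msdb.dbo.sysdtspackages
SELECT * FROM msdb.dbo.sysdtspackages;  -- copies every local package, including all saved versions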
I have deployed to production a number of nested packages (parent packages that call child packages) to the SQL msdb via the Save As option rather than building a deployment utility. These packages reference configuration files in a static location off of the c: drive on the production server. In the development environment, when connection changes are made and I run the Reload with Upgrade option the connection manager takes on the new server and user id settings. However, out on the production side I get the following error from the SQL job log:
Cannot load the XML configuration file. The XML configuration file may be malformed or not valid.
As a result the SQL job uses the default connection information which references the development database rather than the production database. I did research the error but found no good solutions. Is there a way to ensure the configuration files are formed correctly and that the packages are correctly referencing the configuration files? We are trying to run the ETL updates via a SQL job.
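One thing worth checking is exactly how the job step invokes the package and whether it points at the production copy of the configuration file. As a sketch (job, package, server and file paths are all placeholders), a CmdExec job step can force a specific config file with dtexec's /CONFIGFILE switch:

-- Add a CmdExec step that runs the msdb-stored parent package with an explicit config file
EXEC msdb.dbo.sp_add_jobstep
    @job_name = N'Nightly ETL',
    @step_name = N'Run parent package',
    @subsystem = N'CmdExec',
    @command = N'dtexec /SQL "\ParentPackage" /SERVER "PRODSQL01" /CONFIGFILE "C:\Configs\Prod.dtsConfig"';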
I have a user who needs the text of the SQL commands she put into a job on an old server last year. The job does not have to run; she just needs the many SQL commands she used in every step of the job. Sorry if this is a dumb question, but... can I get this back by restoring the old vintage master database onto a new, unused SQL Server (created just for this)?
Hi All; I know that in SQL Server the logins are not contained in the database (thanks to Kevin Yu), but I don't know where they are stored. For example, if I want to transfer to another server I'll have to back up the users too; but HOW? To be more clear: when I simply take a backup from my server and restore it to my computer, I cannot log in again with the user login information that was created on the server. However, even after I make some changes, when I back the DB up from my computer and restore it to the server, the logins work (with re-mapping). Thanks...
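Logins live in the master database (the syslogins catalog), which is why a user-database backup does not carry them; after restoring to another server, the database users are orphaned until they are re-mapped to logins that exist there. A minimal sketch, assuming the login already exists on the target server and using a placeholder user name:

-- List database users whose SIDs do not match any login on this server
EXEC sp_change_users_login 'Report';

-- Re-map (or create) the matching login for one orphaned user
EXEC sp_change_users_login 'Auto_Fix', 'someappuser';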
Hello. I am using SQL Server Management Studio (SQL 2005) and created a daily backup job. Inside the job, I have an "Operating system" step to copy the backed-up files to another directory. However, the job keeps failing with the error "Executed as user ... Access is denied." Given this error, how can I change the user used to execute the job? Thanks.
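By default an Operating system (CmdExec) step runs under the SQL Server Agent service account, so the copy fails if that account cannot reach the target directory. In SQL 2005 you can run the step under a different Windows account via a credential and an Agent proxy. A rough sketch with placeholder account, password, job and proxy names:

-- 1. Store the Windows account that does have access to the target directory
CREATE CREDENTIAL BackupCopyCred
    WITH IDENTITY = N'MYDOMAIN\backupcopy', SECRET = N'StrongPasswordHere';

-- 2. Create an Agent proxy on that credential and allow it for CmdExec steps
EXEC msdb.dbo.sp_add_proxy @proxy_name = N'BackupCopyProxy',
    @credential_name = N'BackupCopyCred', @enabled = 1;
EXEC msdb.dbo.sp_grant_proxy_to_subsystem @proxy_name = N'BackupCopyProxy',
    @subsystem_name = N'CmdExec';

-- 3. Point the copy step at the proxy
EXEC msdb.dbo.sp_update_jobstep @job_name = N'Daily backup',
    @step_id = 2, @proxy_name = N'BackupCopyProxy';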
From the Microsoft TechNet article How to: Run a Package Using a SQL Server Agent Job (http://technet.microsoft.com/en-us/library/ms139805.aspx):
Click the Set Values tab to map properties and variables to values.
Note:
The property path uses this syntax: \Package\<container name>.<property name>. Depending on the package structure, a container may include other containers, in which case nested containers are separated by a backslash (\). For example, \Package\MyForeachLoop\MySequence\MyExecuteSQLTask.Description.
If this is the syntax for writing the container.property value, how do you write the variables?
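As far as I can tell (worth verifying against the same TechNet page), variables are addressed through the package's Variables collection in the property path. With a hypothetical variable name it looks like this:

\Package.Variables[User::MyVariable].Value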
I have a user who has created several SQL 7 databases and uses a VB app which schedules jobs (under the sa account) on the SQL Server. How can I allow this user to view scheduled jobs on the server? I don't want to give him too many privileges.
I have a web application where users have to run SQL scheduled jobs from the webpage. How do I assign permission to a specific user to run specific jobs without making them a member of the sysadmin role?
Any ideas you all smart people? Thanks in advance!
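One pattern that avoids sysadmin, on SQL 2005 and later where msdb ships with the SQLAgent* roles: give the web application's login a user in msdb, add it to one of the Agent roles, and have the page call sp_start_job. Login, user and job names below are placeholders; note that SQLAgentOperatorRole can start any job, while SQLAgentUserRole is limited to jobs that login owns.

USE msdb;
CREATE USER WebAppUser FOR LOGIN WebAppLogin;
EXEC sp_addrolemember N'SQLAgentOperatorRole', N'WebAppUser';

-- Called from the web page / application code
EXEC msdb.dbo.sp_start_job @job_name = N'RebuildReportData';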
I installed SQL Server 2000 (Enterprise) on W2K Advanced Server using my local "Administrator" account. Using the same account for login, I created database backup jobs on the server. After this I renamed the Windows administrator account to "admin". Now I see that the jobs have started failing. The error message indicates "could not determine whether the 'server/administrator' account has access to SQL server."
Further, when I tried to troubleshoot, I discovered that the job owner name still displayed (through EM) is the "server/administrator" account (this account is now renamed). I tried to change the owner name to "server/admin" but it is not getting changed. So I just created a Windows user named "Administrator" and saw that the job then executes successfully.
Please note that the backup is taken on the server itself, and the SQL Server & Agent services are running under the security context of a domain account with administrator privileges on the server.
What I am interested in knowing is: what are the reasons for this behaviour? And how do I rectify the problem, since I really don't want the "administrator" account to be created?
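As background, Agent stores the job owner as a SID rather than a name, which is why renaming the Windows account confuses it. The owner can usually be reset from T-SQL; a sketch with placeholder job and login names:

EXEC msdb.dbo.sp_update_job
    @job_name = N'Nightly backup',
    @owner_login_name = N'SERVER\admin';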
Hi. I'm new to SQL Server jobs, as I have never used the feature manually before. Here is what I want to do, but I don't know if it's possible.
We have a report that's accessed from an ASP page. Currently, the page uses a stored procedure to generate the data. Well, as time has progressed, we've added many items to the report. It now takes a good 15-20 seconds to pull the report. What I want to do is run a job that will compile the data for me and just dump that information into a new table. Does that make sense?
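That's a common pattern. A minimal sketch of the job step, assuming a hypothetical procedure usp_BuildReport and a cache table dbo.ReportCache whose columns match the procedure's result set; the ASP page then just selects from dbo.ReportCache instead of running the procedure:

-- Rebuild the cached report data (run by the scheduled job)
TRUNCATE TABLE dbo.ReportCache;

INSERT INTO dbo.ReportCache
EXEC dbo.usp_BuildReport;  -- the table's columns must match the proc's output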
Hi, I have been scheduling jobs on production for the past couple of months, but they do not execute at the scheduled time.
I checked the schedule's Enabled setting; it was enabled when I scheduled the job, but it was disabled at a later date when I checked to see why the job did not run.
Can someone please guide me if there is any process in SQL Server which disables jobs automatically?
I want to find out how jobs that were scheduled and enabled get disabled on their own.
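Nothing in SQL Server disables a job or its schedule on its own as far as I know, so it is usually a person or a script (maintenance plan edits and scripted job deployments are common culprits). The schedule's own enabled flag lives in msdb's schedule tables (sysjobschedules on 2000, sysschedules on 2005), which are worth checking the same way; a small diagnostic sketch to see which jobs are disabled and when their definitions last changed:

SELECT name, enabled, date_created, date_modified
FROM msdb.dbo.sysjobs
WHERE enabled = 0
ORDER BY date_modified DESC;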
Yao, Could anyone help me with, or point me to information regarding, scheduling jobs for MSSQL 2000? I need to execute a SQL command monthly which sets a budget field to a fixed amount, something like: UPDATE Programs SET Budget = 500. --Kyle Johnson
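A sketch of doing this entirely in T-SQL against msdb; the job name, target database and start time are placeholders, and freq_type = 16 with freq_interval = 1 means day 1 of every month:

USE msdb;
EXEC sp_add_job @job_name = N'Monthly budget reset';

EXEC sp_add_jobstep @job_name = N'Monthly budget reset',
    @step_name = N'Reset budgets',
    @subsystem = N'TSQL',
    @database_name = N'MyAppDb',
    @command = N'UPDATE Programs SET Budget = 500';

EXEC sp_add_jobschedule @job_name = N'Monthly budget reset',
    @name = N'First of month, 2 AM',
    @freq_type = 16,              -- monthly
    @freq_interval = 1,           -- on day 1
    @freq_recurrence_factor = 1,  -- every 1 month
    @active_start_time = 20000;   -- 02:00:00 (HHMMSS)

EXEC sp_add_jobserver @job_name = N'Monthly budget reset';  -- run on the local server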
I am trying to migrate a few of my SQL jobs to SQL 2005. Can I simply script them and execute the script in the new environment? I hope the script is forward compatible.
------------------------ I think, therefore I am - Rene Descartes
I have a package that first needs to copy some files from a workstation computer up to a server. If I set the package execution up as a job, the package executes under a domain account (<domain>\<sql agent>). This domain account does not have access rights to the directory on the workstation, so the package cannot see or copy the files in the directory. I have tried to execute the package from the command line using 'RUNAS' with 'dtexec.exe' (runas /user:<domain>\<user> dtexec.exe ...) using my own user name, and apparently, because you have to wait for the password prompt, this doesn't work either.
So, is there a way for me to set up a package to run under a specific set of credentials so that I am able to copy the files from the workstation up to the server? I realize that just setting up the current <sql agent> user with the proper credentials would be the easiest, but our network admin group won't hear of it. They want one user id that SQL Server, SSIS, SSAS and SSRS run under (the <sql agent> account) and another user id that has the file system access that I described above.
Any help would be much appreciated.
Thank you.
Wayne E. Pfeffer Sr. Systems Analyst Hutchinson Technology Inc.
I have a log file that went out of control (147 GB). I feel that doing a shrink on the file/database would be time consuming, so I am thinking of recreating the database from an imported version from our test server. I know that data can flow from one SQL Server to another without regard to the SQL Server version, but can tables, stored procedures and views be exported from a 2000 database and put back into a 7.0 database? Thanks -- Message posted via http://www.sqlmonster.com
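Before rebuilding from the test server, note that a 2000 database cannot be restored or attached back onto 7.0, so that route means scripting the objects and pumping the data across with DTS. It may be quicker to truncate and shrink the log in place; a sketch with placeholder database and logical log file names (sp_helpfile shows the real logical name):

-- Empty the log (this breaks the log backup chain, so take a full backup afterwards)
BACKUP LOG MyDb WITH TRUNCATE_ONLY;

-- Shrink the log file down to roughly 500 MB
USE MyDb;
DBCC SHRINKFILE (MyDb_Log, 500);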
Hello, I have a production database that I need to refresh to our test environment daily. The database size is 700 MB. I do not need to transfer the stored procedures, triggers, users and logins. Would a DTS package that runs every night be the best and easiest solution to implement, or should I look into log shipping or snapshot replication?
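If a full copy of the database is acceptable (it will bring the procedures, users and so on along whether or not you need them), a nightly backup/restore is often the simplest thing to automate at 700 MB. A sketch with placeholder paths and logical file names:

-- Step 1, on production: back up to a share the test server can read
BACKUP DATABASE ProdDb TO DISK = N'\\TESTSRV\Refresh\ProdDb.bak' WITH INIT;

-- Step 2, on the test server: overwrite the existing copy
RESTORE DATABASE ProdDb
FROM DISK = N'\\TESTSRV\Refresh\ProdDb.bak'
WITH REPLACE,
    MOVE N'ProdDb_Data' TO N'D:\Data\ProdDb.mdf',
    MOVE N'ProdDb_Log' TO N'D:\Logs\ProdDb.ldf';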
I would like to create an SSIS package in 2005 and run it in 2000. Is there any way to do this? Or does SQL Server 2000 have a precursor to SSIS? I am trying to create a job to automatically catch and kill orphaned processes. -Kyle
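SSIS packages themselves won't run on 2000 (its equivalent is DTS), but the orphaned-process cleanup can be done in a plain T-SQL job step on 2000. A rough sketch; the eight-hour idle threshold and the sleeping-status filter are placeholder criteria you would want to tighten for your environment:

-- Kill user sessions that have been idle for more than 8 hours
DECLARE @spid smallint, @cmd nvarchar(20);
DECLARE idle_spids CURSOR LOCAL FAST_FORWARD FOR
    SELECT spid
    FROM master.dbo.sysprocesses
    WHERE spid > 50                      -- skip system spids
      AND spid <> @@SPID                 -- never kill our own session
      AND status = 'sleeping'
      AND last_batch < DATEADD(hour, -8, GETDATE());
OPEN idle_spids;
FETCH NEXT FROM idle_spids INTO @spid;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @cmd = N'KILL ' + CAST(@spid AS nvarchar(10));
    EXEC (@cmd);                         -- KILL needs dynamic SQL for a variable spid
    FETCH NEXT FROM idle_spids INTO @spid;
END
CLOSE idle_spids;
DEALLOCATE idle_spids;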
I'm working with SQL 2000 and am just learning about Maintenance Plans (MP). They seem convenient, but after some time, I'm wondering if they're the best approach long-term. Here are my experiences.
Using the MP Wizard, I created a plan with tasks from all the dialogs.
I was puzzled to find that 4 jobs were created, each with just 1 step and staggered starting times. I expected to find 1 job with 4 steps. So, brimming with confidence, I did just that: I combined all 4 into 1 job, deleted the 3 other MP-created jobs, and checked for any job-specific details in the code. However, now when I open the MP, I get this pop-up:
"One or more of the jobs created for this plan has had additional steps added to it. It is not recommended that jobs created by the maintenance plan be modified in any way."
Okay, fair warning. Yet it appears the job and all its steps run successfully, both on demand and on a schedule. So now I'm wondering if jobs always need an MP. If I don't mind working with xp_sqlmaint syntax, it appears the only thing I'm giving up is the MP history. But I expect job history and '-WriteHistory' will minimize that loss.
I searched BOL, this Forum, and Google, and found a couple articles. One author preferred the ease of the Wizard, another preferred the control and added features of T-SQL, but both created a MP in their examples. So I'm hoping some experienced DBAs can advise.
If I create a job with multiple steps, and no MP, are there important things I give up or problems I create? Is this approach a bad idea in SQL 2005?
At this stage, I don't need replication or other advanced features. Just simple database maintenance.
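For what it's worth, a plain multi-step Agent job with no MP behind it is a common setup; the main thing lost is the MP history and reporting, as noted above. A sketch of the kind of steps involved, with a placeholder database name and backup path (xp_sqlmaint takes the same switch string as the sqlmaint utility, so double-check the exact switches against BOL):

-- Step 1: integrity check
EXEC master.dbo.xp_sqlmaint '-D MyDb -CkDB -WriteHistory';

-- Step 2: rebuild indexes, leaving 10% free space
EXEC master.dbo.xp_sqlmaint '-D MyDb -RebldIdx 10 -WriteHistory';

-- Step 3: full database backup to disk
EXEC master.dbo.xp_sqlmaint '-D MyDb -BkUpDB "D:\Backups" -BkUpMedia DISK -WriteHistory';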