I am trying to execute three DTS packages from a job that I created. When creating the steps I am setting the type to Operating System Command (CmdExec).
The Command I am using is: dtsrun /S servername /E /dtsPackageName
For some reason, whenever I try to run these jobs they fail. Within this job I have three steps; for each step I am using the same command but with a different package name. I have a trusted connection, so my command should work just fine. What am I doing wrong? Can someone please explain this process?
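For reference, a hedged sketch of the dtsrun syntax for a package stored on the server, assuming the package name should be passed with the /N switch rather than appended directly after a slash (server and package names are placeholders):

dtsrun /S servername /E /N "PackageName"

If the package itself is protected with a password, the /M switch would also be needed.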
Hello, I have a problem in my SQL. I'm using a stored procedure and I want to get the newly created ID number so it can be used for an insert into another table. It works like: insert into table() values(); select @id = (newly created id number) from table; insert into table1() values(@id). Something like that.
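A minimal sketch of that pattern, assuming hypothetical table and column names; SCOPE_IDENTITY() returns the identity value generated by the most recent insert in the current scope:

DECLARE @id int;

INSERT INTO dbo.ParentTable (SomeColumn)
VALUES ('some value');

SET @id = SCOPE_IDENTITY();   -- identity value created by the insert above

INSERT INTO dbo.ChildTable (ParentId, OtherColumn)
VALUES (@id, 'other value');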
I had used DTS to copy the Northwind sample database from SQL Server 2000 to 2005. It went OK with no errors. Then I created a stored procedure named upGetCustomer; basically it queries the Customers table, lists some of its fields and orders by CustomerId, CustomerName, Country in descending order. The SP is very simple and has no errors.
Then, I created a SQL Server project in VS2005 using C# and added a stored procedure class as below:

using System;
using System.Data;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public partial class StoredProcedures
{
    [Microsoft.SqlServer.Server.SqlProcedure]
    public static void Customers(string customerid)
    {
        using (SqlConnection sqlConn = new SqlConnection("context connection=true"))
        {
            sqlConn.Open();

            SqlParameter param = new SqlParameter("@custId", SqlDbType.VarChar, 5);
            param.Value = customerid;

            SqlCommand sqlCmd = new SqlCommand();
            sqlCmd.Connection = sqlConn;                   // the command must use the open context connection
            sqlCmd.CommandType = CommandType.StoredProcedure;
            sqlCmd.CommandText = "upGetCustomer";
            sqlCmd.Parameters.Add(param);

            SqlDataReader rdr = sqlCmd.ExecuteReader();
            SqlContext.Pipe.Send(rdr);                     // stream the result set back to the client
        }
    }
}
I took the method of creating a SQLCLR procedure from a source on the Microsoft website, but the coding is mine, and I hope it is correct. I used the SqlConnection string 'context connection=true', which I'm still not quite sure about; I hope it means the current connection I am using under my Windows authentication.

The project compiled and deployed OK on the VS2005 side.

The problem is that when I try to locate the assembly on SQL Server 2005, I can't find it anywhere on the server.

I tried compiling and deploying again, and it keeps saying there is an error in deployment because the customers assembly already exists in the database. I then tried to remove this assembly from the database using a SQL script, DROP ASSEMBLY customers, in the Northwind database, but got an error saying it does not exist. So where is it?
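A hedged sketch for tracking the assembly down: CLR assemblies are per-database, so it is worth checking the database the Visual Studio project actually deployed to (its Database property) before trying to drop anything:

USE Northwind;

-- list user-defined CLR assemblies in this database
SELECT name, permission_set_desc, create_date
FROM sys.assemblies
WHERE is_user_defined = 1;

-- if it shows up, the dependent procedure has to be dropped first, then the assembly
-- DROP PROCEDURE Customers;
-- DROP ASSEMBLY [customers];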
I have created a table and want to do some basic math by adding a few new columns. The problem is that I can't get this to work without creating many new select statements. The new columns that I wish to add refer to other newly created columns. Is there a way I can do this with a CTE or subqueries, or is it a best practice to chain out the logic for the newly created columns?

I have an example from AdventureWorksDW since the data is very accessible. I can safely create EMP_TENURE and PTO_REMAINING in this select statement. I would then need to create a new select statement to define BONUS, and then another select statement to define NEW_COL1, and so on.

I'm still pretty new at SQL and am trying to learn how to complete such a task using subqueries or a CTE.
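A hedged sketch of chaining CTEs so later columns can refer to earlier derived ones; it assumes AdventureWorksDW's DimEmployee table with HireDate and VacationHours columns, and the bonus rule is purely hypothetical:

;WITH base AS
(
    SELECT EmployeeKey,
           DATEDIFF(year, HireDate, GETDATE()) AS EMP_TENURE,
           VacationHours                       AS PTO_REMAINING
    FROM dbo.DimEmployee
),
calc AS
(
    SELECT *,
           EMP_TENURE * 100.0 AS BONUS          -- hypothetical calculation
    FROM base
)
SELECT *,
       BONUS + PTO_REMAINING AS NEW_COL1        -- refers to columns defined in the previous layer
FROM calc;

Each CTE (or nested derived table) layer can reference the aliases defined in the layer before it, which avoids repeating the expressions.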
Hi! I have a piece of code that connects to the master database and uses a simple 'CREATE DATABASE MYDATABASE' statement to create a new database on a SQL Server 2005 Express instance.

I then close that connection and try to open a new connection using my newly created database. I use integrated authentication in both cases. However, I get the following error: Cannot open database "MYDATABASE" requested by the login. The login failed. Login failed for user 'DOMAINNAME\username'. System.Data.SqlClient.SqlException: Cannot open database "MYDATABASE" requested by the login. The login failed.

I don't get this error all the time. I have run my code many times and it sometimes works, but in some cases it doesn't. The database is always created, but sometimes I cannot connect to it. Yet if I wait a little and try again with the same database, the connection is made.
I know that my connections are opened and closed normally and they don't remain open. So I don't believe I reach the maximum number of available connections.
I also sometimes get this exception: A connection was successfully established with the server, but then an error occurred during the login process. (provider: Shared Memory Provider, error: 0 - No process is on the other end of the pipe.)
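One hedged guess is that the new database is still coming online when the second connection is attempted. A sketch of a check that could be run on the existing master connection before reconnecting (MYDATABASE as in the post):

WHILE NOT EXISTS (SELECT 1
                  FROM sys.databases
                  WHERE name = 'MYDATABASE'
                    AND state_desc = 'ONLINE')
BEGIN
    WAITFOR DELAY '00:00:01';   -- wait a second and check again
END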
I created a new database in SQL Server 2000, using the command:

CREATE DATABASE ContactManager
ON PRIMARY
(
    NAME = ContactManager,
    FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL\Data\ContactManager.mdf',
    SIZE = 5MB,
    MAXSIZE = 20MB,
    FILEGROWTH = 5MB
)
LOG ON
(
    NAME = ContactManager,
    FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL\Data\ContactManager.ldf',
    SIZE = 2MB,
    MAXSIZE = 10MB,
    FILEGROWTH = 2MB
)
GO

The database was successfully created and I created two tables in it. Now when I try to create a new connection in Visual Web Developer 2005, I get the error message: "Directory lookup for the file "C:\Program Files\Microsoft SQL Server\MSSQL\Data\ContactManager.mdf" failed with the operating system error 5 (Access is denied.). Cannot attach the file "C:\Program Files\Microsoft SQL Server\MSSQL\Data\ContactManager.mdf" as 'ContactManager'."
I have many new tables for which I need to write insert, update and delete triggers manually. Is there any way to generate a trigger script that takes a table name as an input parameter and prints/generates the trigger's script?
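A minimal sketch of the idea, assuming a hypothetical template body; the real script would splice in the per-table insert/update/delete logic:

DECLARE @TableName sysname;
DECLARE @sql nvarchar(max);

SET @TableName = 'dbo.MyTable';   -- hypothetical input parameter

SET @sql = N'CREATE TRIGGER trg_' + PARSENAME(@TableName, 1) + N'_iud
ON ' + @TableName + N'
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    -- per-table logic goes here
END';

PRINT @sql;        -- review the generated script
-- EXEC (@sql);    -- or create the trigger directly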
Why are the user databases that were created in the MSSQLSERVER default instance showing up in the newly created ALPHAONE instance? I'm successfully logging into the AlphaOne instance, as it shows as "DASAlphaOne,1xxxP" (port) at the top of the treeview in SSMS. I'm logging in as sa and can edit anything.

The issue here is that all the user databases are shown and can even be edited. I created this instance in an effort to hide databases from whoever is not supposed to see them. I was expecting a clean instance with only the system databases. Is there something that can be set to keep each instance's databases private to itself?
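A quick sanity check worth running in that SSMS connection, since one common explanation is that the connection is actually going to the default instance rather than the new one:

SELECT @@SERVERNAME                   AS ServerName,
       SERVERPROPERTY('InstanceName') AS InstanceName,
       SERVERPROPERTY('ServerName')   AS FullInstanceName;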
We've been running a mirrored database (using certificates since we don't have a domain) and it's all working well. Last week we decided to add a witness for automatic failovers, but for some reason I just can not get the witness to connect to the Partner2 server.
See screenshot here
Please help me troubleshoot this - I re-created the endpoints / users / certificates but it's still not working. Where can I get more information on what exactly the problem is? Can I test the endpoints somehow?
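A hedged sketch of a query to run on each of the three servers to see the mirroring endpoint's state, role and port (a plain TCP connection test from Partner2 to the witness's port is another quick check):

SELECT e.name,
       e.state_desc,       -- should be STARTED
       e.role_desc,        -- PARTNER / WITNESS / ALL
       t.port
FROM sys.database_mirroring_endpoints AS e
JOIN sys.tcp_endpoints AS t
     ON t.endpoint_id = e.endpoint_id;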
I guess this should be rather simple to answer, but I just don't get how to do this. I just want to import some data from an Access mdb file into a SQL Server 2005 database.

As I don't want to create all the tables in SQL Server 2005 beforehand, I'd like to create them from my SSIS package. Therefore I use an 'Execute SQL Task' in the control flow before stepping into the data flow, which contains only an OLE DB source and an OLE DB destination...

VERY simple... My problem is that I can't select the table from the selection list in the OLE DB destination because it does not exist before executing the package...

Creating the table by hand just to be able to select it does not help, because if everything is set up and I then delete the table (since it will be created by the package anyway), an error occurs before executing the package, in the validation phase:

Package Validation Error: "Invalid object name 'myToBeCreatedTable'".

Can't I create tables on the fly which I then use in my data flow tasks?
Kind regards,
Wolfgang
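For reference, a hedged sketch of the statement the Execute SQL Task could run so the destination table always exists when the package runs (the table name is the one from the post; the columns are placeholders). Making the CREATE TABLE idempotent, together with setting DelayValidation = True on the data flow task (or ValidateExternalMetadata = False on the OLE DB destination), is the usual way around the design-time validation error:

IF OBJECT_ID('dbo.myToBeCreatedTable') IS NULL
BEGIN
    CREATE TABLE dbo.myToBeCreatedTable
    (
        Col1 int,             -- placeholder columns; match the Access source
        Col2 nvarchar(50)
    );
END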
I am new to SQL programming. I am learning the basics. I am trying to create a simple query like this:

SELECT Column_1, Column_2, Column_3,
       10*Column_1 AS Column_4,
       10*Column_2 AS Column_5,
       -- this is the step I cannot get to work:
       Column_1*Column_5 AS Column_6
FROM Table_1

The first 3 columns are available within the original Table_1. Column_4 and Column_5 have been created by me by doing some calculations on the original columns.

Now, when I try to do FURTHER CALCULATION on these newly created columns, SQL Server does not allow that.

I was hoping that I would be able to use the newly created Columns 4 and 5 within this same query to do further calculations, but that does not seem to be the case, or am I doing something wrong here?

If I have to create a new column by the name of Column_6, which is actually a multiplication of the original Column_1 and the newly created Column_5 (I tried this: Column_1*Column_5 AS Column_6), then what is the possible solution for me?

I have tried to present my problem in the simplest possible manner. The actual query has many original columns from Table_1 and many calculated columns that are created by me, and now I have to do various calculations that involve making use of both these types of columns.
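A minimal sketch of the usual workaround: column aliases defined in a SELECT list cannot be referenced later in that same list, but wrapping the calculation in a CTE (or derived table) makes the aliases available to an outer query:

;WITH calc AS
(
    SELECT Column_1, Column_2, Column_3,
           10 * Column_1 AS Column_4,
           10 * Column_2 AS Column_5
    FROM Table_1
)
SELECT Column_1, Column_2, Column_3,
       Column_4, Column_5,
       Column_1 * Column_5 AS Column_6   -- now legal: Column_5 exists in the CTE output
FROM calc;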
select col1, col2, col3, col4 from Table where col2=5 order by col1
I have a primary key on the table (the clustered index). The execution plan shows the clustered index scan costed at 30% and the sort at 70%. When I run the query I get a missing-index hint on col2 with 95% impact, so I created a nonclustered index on col2. The total execution time decreased by around 80 ms, but I don't see that index name being used anywhere in the execution plan; after creating the index I still see the same plan.

The execution plan still shows the clustered index scan at 30% and the sort at 70%, but I can see the total time and the logical reads on that table reducing. I am sure the index is useful, but why is there no change in the execution plan?
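A hedged sketch of an index that might change the plan: keying on the filter column followed by the ORDER BY column, and INCLUDE-ing the remaining selected columns, makes the index covering, which is usually what the optimizer needs before it will prefer it over a clustered index scan (names taken from the query above):

CREATE NONCLUSTERED INDEX IX_Table_col2_col1
ON dbo.[Table] (col2, col1)      -- col2 for the WHERE, col1 for the ORDER BY
INCLUDE (col3, col4);            -- covers the remaining SELECT columns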
I am getting ready to start a project where I am charged with moving old data out of production into a newly created historical DB. We have about 8 internal audit tables that are big and full of old data. These tables are barely used and are taking up way too much space and maintenance time.

I would like to create a way (SSIS?) to look at the date field in each of the 8 tables and copy anything older than two years into my newly created history DB, then delete the older records from the source DB.

I don't know if SSIS is the best method to use. If it is, which containers should I use to move the data, and how do I then delete from the source?

Can I do the mass deletes on my audit source tables without impacting performance/indexes/fragmentation?
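A minimal sketch of the copy-then-delete pattern, with hypothetical database, table and column names; deleting in batches keeps the transaction log and blocking under control, and each statement can be wrapped in an Execute SQL Task if SSIS is used for orchestration:

-- copy rows older than two years into the history database
INSERT INTO HistoryDB.dbo.AuditTable1
SELECT *
FROM   ProdDB.dbo.AuditTable1
WHERE  AuditDate < DATEADD(year, -2, GETDATE());

-- then delete the same rows from the source in batches
WHILE 1 = 1
BEGIN
    DELETE TOP (5000)
    FROM  ProdDB.dbo.AuditTable1
    WHERE AuditDate < DATEADD(year, -2, GETDATE());

    IF @@ROWCOUNT = 0 BREAK;
END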
I have three tables called plandescription, plandetail and analysisdetail. The table plandescription has the column DetailQuestionID, which is the primary key and identity column, and a QuestionDescription column.

The table plandetail consists of the column PlanDetailID, which is the primary key and identity column, DetailQuestionID, which is a foreign key referencing the plandescription table, and a planID column.

The third table, analysisdetail, consists of an analysisID, which is the primary key and identity column, PlanDetailID, which is a foreign key referencing the plandetail table, and a scenario column.
Below is the schema of the three tables
I have two web forms that will insert, update and delete data in these three tables in two transactions. One web form will perform CRUD operations on the plandescription and plandetail tables. When the user enters QuestionDescription and planID in this web form, I insert the QuestionDescription value into the plandescription table, which generates a DetailQuestionID value, and this value is fed to the plandetail table with the planID. Here a PlanDetailID is generated.
Once this transaction is done, I will show the second web form in which the user enters the scenario and this will be mapped with the plandescription using the PlanDetailID.
This schema cannot be changed as this is the client requirement. When I insert values I don't have any problem. However, when I update existing data, I need to delete the existing PlanDetailID rows in the plandetail table and recreate the PlanDetailID data for that DetailQuestionID and planID. This is because the user will be adding or deleting a planID associated with the QuestionDescription.
Once I recreate PlanDetailID for that DetailQuestionID and planID, I need to update the old PlanDetailID with the new PlanDetailID in the third table analysisdetail for the associated analysisID.
I created a #Temp table called #DetailTable to insert the values analysisID, planid and old PlanDetailID and new PlanDetailID so that I can have them in update statement once I delete the data from plandetail table for that PlanDetailID.
Then I deleted the plandetailid from the plandetail table and recreate PlanDetailID for that DetailQuestionID. During my recreation I fetched the new PlanDetailID’s created into another temp table called #InsertedRows
After this I am running a while loop to update the temp table #DetailTable with the newly created PlanDetailID for the appropriate planIDs. The problem is here: when I have the same number of planIDs (for example, 2 planIDs: 1 and 2), I get exactly two old PlanDetailIDs and new PlanDetailIDs for that planID and analysisID. But when I add a new planID or remove an existing planID, I get a null value for that newly added or deleted planID. This is affecting my update statement on the analysisdetail table, as PlanDetailID cannot be null.

I tried to remove the null value from the #DetailTable by running the update statement of analysisdetail in a while loop, but it's not working.
DECLARE @categoryid INT = 8
DECLARE @DetailQuestionID INT = 1380

/*------- I need the query to run for the below three data. Here i'm updating my planids that already exists in my database */
DECLARE @planids VARCHAR(MAX) = '2,4,5'
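A hedged sketch of one way to avoid the WHILE loop and the NULLs: capture the planID alongside each newly generated PlanDetailID with an OUTPUT ... INTO clause, then join back to #DetailTable by planID. Table and column names follow the post; dbo.SplitPlanIds and the NewPlanDetailID column are hypothetical.

DECLARE @InsertedRows TABLE (PlanDetailID int, planID int);

-- recreate the plandetail rows and capture the new IDs together with their planID
INSERT INTO plandetail (DetailQuestionID, planID)
OUTPUT inserted.PlanDetailID, inserted.planID
INTO   @InsertedRows (PlanDetailID, planID)
SELECT @DetailQuestionID, s.planID
FROM   dbo.SplitPlanIds(@planids) AS s;       -- hypothetical list-splitting function

-- set-based update: no loop, no NULLs for added or removed planIDs
UPDATE d
SET    d.NewPlanDetailID = i.PlanDetailID     -- hypothetical column in #DetailTable
FROM   #DetailTable  AS d
JOIN   @InsertedRows AS i ON i.planID = d.planID;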
I am setting up SQL Audit on the SQL Servers in my environment based on a requirement. I want database audit specifications to be created as soon as a database is created. I tried a DDL trigger, but Audit doesn't support triggers, so I created audit specifications on the model database. The only problem with this is that every specification is created on the new database with the same name. Is there a way to have the database specification name include the newly created database name, or is there another method to create database specifications on newly created databases?
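For reference, a hedged sketch of the model-database approach described above; MyServerAudit is a placeholder, and because model is the template for every new database, each new database ends up with a specification of this same name:

USE model;
GO
CREATE DATABASE AUDIT SPECIFICATION NewDatabaseAuditSpec
FOR SERVER AUDIT MyServerAudit                 -- placeholder server audit name
ADD (SCHEMA_OBJECT_CHANGE_GROUP)
WITH (STATE = ON);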
I'm looking for a way to refer to a package variable within any Transact-SQL code included in either an Execute SQL or Execute T-SQL task. If this can be done, I need to know the technique to use - whether it's something similar to a parameter placeholder question mark or something else.
FYI - I've been able to successfully execute Transact-SQL statements within the Execute SQL task, so I don't think the Execute T-SQL task is even necessary for this purpose.
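With an OLE DB connection manager, the Execute SQL Task uses question-mark placeholders that are bound to package variables on the task's Parameter Mapping page (an ADO.NET connection would use @-named parameters instead). A small sketch with a hypothetical table and variable:

-- ? is mapped to a package variable (e.g. User::SourceFileName) on the Parameter Mapping page
INSERT INTO dbo.PackageRunLog (RunDate, SourceFile)
VALUES (GETDATE(), ?);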
I have a master package, which executes child packages that are located on a SQL Server. The Child packages execute other child packages which are also located on the SQL server.
Everything works fine when I execute in process. But when I set the parameter in the master package's Execute Package Task to ExecuteOutOfProcess = True, I get the following errors:
Error: 0xC00470FE at DFT Load Data, DTS.Pipeline: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "Row Count" (5349).
Error: 0xC00470FE at DFT Load Data, DTS.Pipeline: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "SCR Custom Split" (6399).
Error: 0xC00470FE at DFT Load Data, DTS.Pipeline: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "SCR Data Source" (5100).
Error: 0xC00470FE at DFT Load Data, DTS.Pipeline: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "DST_SCR Load Data" (6149).
The child packages all run fine when executed directly, and the master package runs fine if Execute Out of Process is False.
Below is a simplified table & dataset to illustrate a problem I'm experiencing with a more complex one.
Code Block
create table #test
(
    recno smallint PRIMARY KEY,
    value decimal (18,2)
)
insert into #test values (1, 3.57)
insert into #test values (2, 5.32)
insert into #test values (3, 6.29)
insert into #test values (4, 9.25)
insert into #test values (5, 0.84)
Method 1: I tried inserting rows from #test into a global temp table (##table) as follows
Code Block
declare @n as nvarchar(3)
set @n = 1
while @n <= (select count(recno) from #test)
begin
    exec ('insert into ##table select *, originalrecno = (select recno from #test where recno = ' + @n + ') from #test')
    set @n = @n + 1
end

However, this yields an error message:
Code Block
Server: Msg 208, Level 16, State 1, Line 2
Invalid object name '##table'.

Note - you can comment out the insert into ##table line above to view the results that I'm trying to put into ##table.
Method 2: next I tried explicitly creating ##table & rerunning the loop containing the insert
declare @n as nvarchar(3)
set @n = 1
while @n <= (select count(recno) from #test)
begin
    exec ('insert into ##table select *, originalrecno = (select recno from #test where recno = ' + @n + ') from #test')
    set @n = @n + 1
end

This worked - it inserted the data from the select statements in the loop into ##table.
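For what it's worth, Method 1 fails because INSERT INTO never creates its target table (unlike SELECT ... INTO), so the global temp table has to exist first. A sketch of the explicit table that Method 2 presupposes, with column types inferred from #test plus the extra originalrecno column:

create table ##table
(
    recno smallint,
    value decimal(18,2),
    originalrecno smallint
)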
I need the file created by my BCP QUERYOUT command, which I'm executing from xp_cmdshell in Transact-SQL, to have a specific owner. I've tried using sp_xp_cmdshell_proxy_account to set up the correct owner, but the owner is always SERVERNAME\Administrators.
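For reference, a hedged sketch of setting the proxy account (placeholder credentials). Note that the proxy account only applies when xp_cmdshell is called by a login that is not a sysadmin; for sysadmin callers the command runs under the SQL Server service account, which may explain why the file owner keeps coming out as the local Administrators group.

EXEC sp_xp_cmdshell_proxy_account 'DOMAIN\BcpFileOwner', 'PlaceholderP@ssw0rd';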
I loaded the import/export wizard into my Tools menu. I ran it and exported a view to an excel worksheet with mixed results (I got the headings, but no data). I want to take a look at the package that should have been created (I selected that option, but the export didn't complete and I couldn't decipher the log to determine whether the package was created). If it was successfully created, where would I find it in SQL 2005 Express?
I have a For Each Loop that populates a SQL Server table from a set of flat files. I run the flat file import via a DTS package embedded in an Execute DTS 2000 Package Task. I want to pass the source file name that is fetched by the For Each Loop and assign it to a global variable in the DTS package. How can this be done?
I have an SSIS job; one of the last steps it performs is to execute a SQL 2000 DTS package. This has to be done as a SQL 2000 DTS package, as it is performing rebuilds of SQL 2000 Analysis Services dimensions and cubes. We've found that when the DTS package fails, the SSIS job happily completes and shows success; we would prefer to know it went wrong.

As far as I'm aware, SSIS merely starts the DTS package off and doesn't care about its result. I've looked into turning on logging for the Execute DTS 2000 Package Task and thought that the ExecuteDTS80PackageTaskTaskResult would give me the answer I need, but it is merely written to the log and not available to an event handler. It also looks like it is not safe to put a SQL task in as the next item to go look at the SQL 2000 system tables for the DTS package's log, as the SSIS documentation warns that the DTS package can continue to run after the Execute DTS 2000 Package Task has ended.

Ideally I want any error raised within the DTS package to cascade up as an error in the SSIS job, which I can then handle appropriately. I cannot find a way to do this. Is there a way?
If not, can anyone suggest how in the remainder of the SSIS tasks I can be sure that the DTS has completed before I start any other tasks that will check for the SQL 2000 log of its execution?
I am having problems executing a child package from a parent package using the Execute Package Task. I am attempting to run the master package through a SQL Server Agent job.
The SQL Server Agent job is owned by sa. The step that runs the parent package is configured to load the package from the SSIS Package Store on the same server that the job is running.
I have the Execute Package Task configured as follows:
Location: SQL Server ExecuteOutOfProcess: True Connecting as a SQL Server login (let's say TestEtl)
I have added the db_dtsoperator database role to both the TestEtl login and the login that SQL Server Agent connects through. I have also configured the child package's reader role to include db_dtsoperator. Per http://msdn2.microsoft.com/en-US/library/ms141053.aspx, this should allow these logins to run the child package.
I have enabled logging of all events in both the parent and child packages. I see the following in the logs when the Execute Package Task executes (omitted portions unrelated to the execution of the child package task):
450939 OnPreExecute ChildPackage 2007-06-08 13:35:17.000 2007-06-08 13:35:17.000
450940 OnPreValidate ChildPackage 2007-06-08 13:35:17.000 2007-06-08 13:35:17.000
450941 OnPostValidate ChildPackage 2007-06-08 13:35:17.000 2007-06-08 13:35:17.000
450942 User: Diagnostic ETL 2007-06-08 13:35:17.000 2007-06-08 13:35:17.000 ExternalRequest_pre: The object is ready to make the following external request: 'IDataInitialize::GetDataSource'.
450943 User: Diagnostic ETL 2007-06-08 13:35:17.000 2007-06-08 13:35:17.000 ExternalRequest_post: 'IDataInitialize::GetDataSource succeeded'. The external request has completed.
450944 User: Diagnostic ETL 2007-06-08 13:35:17.000 2007-06-08 13:35:17.000 ExternalRequest_pre: The object is ready to make the following external request: 'IDBInitialize::Initialize'.
450945 User: Diagnostic ETL 2007-06-08 13:35:17.000 2007-06-08 13:35:17.000 ExternalRequest_post: 'IDBInitialize::Initialize succeeded'. The external request has completed.
450946 User: Diagnostic ETL 2007-06-08 13:35:17.000 2007-06-08 13:35:17.000 ExternalRequest_pre: The object is ready to make the following external request: 'IDBCreateSession::CreateSession'.
450947 User: Diagnostic ETL 2007-06-08 13:35:17.000 2007-06-08 13:35:17.000 ExternalRequest_post: 'IDBCreateSession::CreateSession succeeded'. The external request has completed.
450948 OnError ChildPackage 2007-06-08 13:35:17.000 2007-06-08 13:35:17.000 Error 0x80070005 while preparing to load the package. Access is denied.
450949 OnError ParentPackage 2007-06-08 13:35:17.000 2007-06-08 13:35:17.000 Error 0x80070005 while preparing to load the package. Access is denied.
450950 OnTaskFailed ChildPackage 2007-06-08 13:35:17.000 2007-06-08 13:35:17.000
450951 OnPostExecute ChildPackage 2007-06-08 13:35:17.000 2007-06-08 13:35:17.000
450952 OnWarning ParentPackage 2007-06-08 13:35:17.000 2007-06-08 13:35:17.000 SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but the number of errors raised (1) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
450953 OnPostExecute ParentPackage 2007-06-08 13:35:17.000 2007-06-08 13:35:17.000
450954 PackageEnd ParentPackage 2007-06-08 13:35:17.000 2007-06-08 13:35:17.000 End of package execution.
I am sure that what I am doing is quite common, and I obviously have something misconfigured somewhere - but I'm not sure what my misconfiguration is. Can anyone enlighten me?
I now have two SSIS packages, "TESTING" and "LOADING". The "TESTING" package has an Execute Package Task that calls the "LOADING" package. When I want to execute the TESTING package, how can I set up the connection string so that I can edit the password of the database connected to by the "LOADING" package?
I have about 100 DTS packages that were created by a user who has left the company. Now I (the new user) am trying to open the packages from another client machine and save them back. I am connecting to the SQL Server where the packages are stored using NT authentication (a domain account). When I try to save a package, I get the following error:

OLE DB: Only the owner of the DTS package "packagename", or a member of the sysadmin role, can create new versions of it.

What I have found is that I can save the same package with a new name without being a member of the sysadmin role, but I cannot save the package with the same name. Any help will be greatly appreciated.
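A quick way to confirm who msdb currently records as the owner of each stored package (a sketch; run against the server holding the packages):

SELECT name, owner, createdate
FROM   msdb.dbo.sysdtspackages
ORDER BY name, createdate DESC;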
I was working on a flat file import spec yesterday, using the import wizard, and I saved my work as a SSIS package. Today, if I connect to Integration Services through SQL Management Studio, I can see the package, but I can't edit it. I've tried to open the package using BIDS, but I can't find it on the server. Obviously it exists, but where?
How the heck can I edit this package? And, where is it? I don't understand why things are so much more difficult in 2005 vs. 2000.