I have the following error when I try to execute a DTS package. All the package does is run a bit of WScript to map a drive (needed by other packages):
[code]
Function Main()
    Dim WshNetwork
    Set WshNetwork = WScript.CreateObject("WScript.Network")
    WshNetwork.RemoveNetworkDrive "S:"
    WshNetwork.MapNetworkDrive "S:", "\\myserver\folder1\folder2\folder3\folder4"
    Main = DTSTaskExecResult_Success
End Function
[/code]
If I copy it out to a .vbs file and execute it logged in as the SQL Agent profile, it works. If I execute the package under a job, or just execute it directly, it fails with the above error. Any ideas?
I built a small package two years ago that uses Flat File Sources to copy in small text data files. Each source connection object has a UNC path to flat text files on another server. The source system changed, so I opened the package, updated the UNC path in one Connection Manager object, and clicked OK. The Flat File Source Editor that uses this source seemed to be able to see the new location when I clicked "Preview". Then I went back to the file source, and the connection had reverted to the original one; it would not save the new UNC path.
I am using SQL Server 2012 SP2 with SSDT (run as admin). I closed the package in SSDT, edited the connection strings using XMLnotepad, and was then able to open, test, build and deploy the package.
It seems that the Source object will not let itself be changed. The other option is to delete it and recreate it, but I didn't want to remap the fields.
I hope this is a simple question. I have a package-scope user variable which is populated using a Recordset Destination in a Data Flow task. I am attempting to read the variable multiple times from different Script Tasks. The first read works fine, however the second read, in the second Script Task, says that there are no rows.
Has anyone run across this before? Any thoughts would be appreciated.
Hi there. This relates to an MS Access database, but since I use the SqlDataSource control I thought I should post here. I have a project I was working on with this Access db using the SQL controls, and everything was working just fine. Then one day I started getting "Object reference not set to an instance of an object" messages when I try to design a query or retrieve a schema. Nothing works at design time anymore, but at runtime everything is perfect. It's a lot of work for me now to create columns, schemas and everything manually. I've tried reinstalling Visual Studio and the ADO components, but nothing seems to fix it. Did this ever happen to any of you? Any tip is really appreciated, thanks a lot.
Each time I press submit to insert data into the database I receive the following message. I use the same code on another page and it works fine. Here is the error:
Object reference not set to an instance of an object. Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.NullReferenceException: Object reference not set to an instance of an object.
Source Error:
Line 125: MyCommand.Parameters("@Balance").Value = txtBalance.Text
Line 126:
Line 127: MyCommand.Connection.Open()
Line 128:
Line 129: Try
[NullReferenceException: Object reference not set to an instance of an object.]
   CreditRepair.CreditRepair.Vb.Creditor_Default.btnSaveAdd_Click(Object sender, EventArgs e) in c:\inetpub\wwwroot\CreditRepair\Creditor_Default.aspx.vb:127
   System.Web.UI.WebControls.Button.OnClick(EventArgs e)
   System.Web.UI.WebControls.Button.System.Web.UI.IPostBackEventHandler.RaisePostBackEvent(String eventArgument)
   System.Web.UI.Page.RaisePostBackEvent(IPostBackEventHandler sourceControl, String eventArgument)
   System.Web.UI.Page.RaisePostBackEvent(NameValueCollection postData)
   System.Web.UI.Page.ProcessRequestMain()
Private Sub btnSave_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles btnSave.Click
If (Page.IsValid) Then
Dim DS As DataSet
Dim MyCommand As SqlCommand
Dim AddAccount As String = "insert into AccountDetails (Account_ID, Report_ID, Balance) values (@Account_ID, @Report_ID, @Balance)"
MyCommand = New SqlCommand(AddAccount, MyConnection)
Is there a step-by-step paper on how to get there? Here is what I need to consider: I will have many customers that will need their own set of records and access pages "branded for their company", and each customer will have many clients. I am hosting this application on a Windows 2003 server with SQL Server 2005 Enterprise.
I am using Windows authentication. I have created a username in Windows, then added the Windows user in SQL Server Management Studio under Security, granted "DB Read" and "DB Write", and did the same again under the database's Security tab. Authentication from the web still fails. I must be missing a step or two?
I expect to set up a username for each database as I set up new customers.
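For reference, here is a minimal T-SQL sketch of the setup described above; the login, database, and user names are placeholders, not taken from the post:

[code]
-- Create a server login for the Windows account (placeholder names)
CREATE LOGIN [MYSERVER\CustomerOneUser] FROM WINDOWS;
GO

USE CustomerOneDb;
GO

-- Map the login to a database user and grant read/write roles
CREATE USER [MYSERVER\CustomerOneUser] FOR LOGIN [MYSERVER\CustomerOneUser];
EXEC sp_addrolemember N'db_datareader', N'MYSERVER\CustomerOneUser';
EXEC sp_addrolemember N'db_datawriter', N'MYSERVER\CustomerOneUser';
[/code]

One thing worth checking in this situation: if the web application runs under its application pool identity rather than impersonating the visitor, it is that account, not the user above, that needs the login and database roles.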
I am using SQL Server Reporting Services 2008/2012 (SSRS), and my report viewer contains body content with 3 row groups. When the report is printed, the data prints with blank space and the remaining data continues on the next page.
Departure flight: 70 rows
First page: 42 rows printed
Second page: 23 rows printed [supposed to print 28; if the total record count is more than 23 and less than 42, the page prints only 23 records]
Third page: 5 rows printed

Departure flight: 42 rows
First page: 42 rows printed [the report allows a maximum of 42 rows per page, so if the total is exactly 42 it prints perfectly]

Departure flight: 26 rows
First page: 23 rows printed [supposed to print 26; if the total record count is more than 23 and less than 42, the page prints only 23 records]
Second page: 3 rows printed
The ERP manufacturer used an image data type to store large text data fields. I am trying to move these fields from one database to another using either SQL queries or MS Access. I can cast them to an 8000-character varchar to read them directly, but have had no luck importing into these image data fields.
Access and Crystal are not able to read these fields directly.
Any suggestions? Most information about these fields has to do with loading files but I am just moving data.
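If both databases are SQL Server, a plain cross-database INSERT ... SELECT can usually move image columns as-is; getting character data back into an image column needs an explicit trip through varbinary. A rough sketch, with hypothetical table and column names:

[code]
-- Straight copy when both columns are image (hypothetical names)
INSERT INTO TargetDb.dbo.Notes (NoteID, NoteBlob)
SELECT NoteID, NoteBlob
FROM SourceDb.dbo.Notes;

-- Loading character data into an image column: convert via varbinary first
INSERT INTO TargetDb.dbo.Notes (NoteID, NoteBlob)
SELECT NoteID, CONVERT(image, CONVERT(varbinary(8000), NoteText))
FROM SourceDb.dbo.StagingNotes;
[/code]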
I did not see a forum for the SQL Server 2000 DTS.
I have a flat file feeding a table via a data pump. The table is only used by this process. It will run for about 30 minutes and then fail. The message in the history does not give any detail on why it is failing. Below is the message I get; if I rerun the job, it works fine. Can anyone help me, please?
Date: 07/23/2007 6:00:02 AM
Log: Job History (Daily: Load EOL from MVS1 (First Run))

Step ID: 1
Server: PIT-CS-M608
Job Name: Daily: Load EOL from MVS1 (First Run)
Step Name: Daily: Load tblCaseMasterSched
Duration: 00:28:05
Sql Severity: 0
Sql Message ID: 0
Operator Emailed:
Operator Net sent:
Operator Paged:
Retries Attempted: 0
Message: Executed as user: PIT-CS-M608\SYSTEM.
...rt: DTSStep_DTSActiveScriptTask_1
DTSRun OnFinish: DTSStep_DTSActiveScriptTask_1
DTSRun OnStart: DTSStep_DTSExecuteSQLTask_1
DTSRun OnFinish: DTSStep_DTSExecuteSQLTask_1
DTSRun OnStart: DTSStep_DTSDataPumpTask_1
DTSRun OnProgress: DTSStep_DTSDataPumpTask_1; 1000 Rows have been transformed or copied.; PercentComplete = 0; ProgressCount = 1000
DTSRun OnProgress: DTSStep_DTSDataPumpTask_1; 2000 Rows have been transformed or copied.; PercentComplete = 0; ProgressCount = 2000
DTSRun OnProgress: DTSStep_DTSDataPumpTask_1; 3000 Rows have been transformed or copied.; PercentComplete = 0; ProgressCount = 3000
DTSRun OnProgress: DTSStep_DTSDataPumpTask_1; 4000 Rows have been transformed or copied.; PercentComplete = 0; ProgressCount = 4000
DTSRun OnProgress: DTSStep_DTSDataPumpTask_1; 5000 Rows have been transformed or copied.; PercentComplete = 0; ProgressCount = 5000
DTSRun OnProgress: DTSStep_DTSDataP... Process Exit Code 1. The step failed.
I am transferring data from one SQL table to another. The first table has a PK on the unique id only; the second table has a PK on five fields (the idea being to reject duplicate records, etc.). I am using a DTS package to do this, but when run it fails when it hits a PK violation. How do I get around this? What simple thing am I missing?
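If the transfer can be done as a T-SQL step rather than a straight data pump, one common workaround is to filter out the rows that would violate the composite key before inserting. A rough sketch with placeholder table and column names:

[code]
-- Insert only rows whose five-column key is not already in the target (placeholder names)
INSERT INTO dbo.TargetTable (KeyCol1, KeyCol2, KeyCol3, KeyCol4, KeyCol5, OtherCol)
SELECT s.KeyCol1, s.KeyCol2, s.KeyCol3, s.KeyCol4, s.KeyCol5, s.OtherCol
FROM dbo.SourceTable AS s
WHERE NOT EXISTS (
    SELECT 1
    FROM dbo.TargetTable AS t
    WHERE t.KeyCol1 = s.KeyCol1
      AND t.KeyCol2 = s.KeyCol2
      AND t.KeyCol3 = s.KeyCol3
      AND t.KeyCol4 = s.KeyCol4
      AND t.KeyCol5 = s.KeyCol5
);
[/code]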
Next month the client is moving servers to a new location. The servers will get a new IP and subnet. Our user DBs, which are on a SAN, will be moving to a new SAN. My plan is the following; please correct any mistakes.
1. Do full backups of everything
2. Detach the user DBs
3. Copy these files to a USB box (about 100 GB worth)
4. After the server is in the new location, bring it up
5. Copy the DB files from the USB box
6. Attach the DB files
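For steps 2 and 6, the detach and attach themselves boil down to something like the sketch below; the database name and file paths are placeholders:

[code]
-- On the old server: detach the user database (placeholder name)
EXEC sp_detach_db @dbname = N'UserDb';

-- On the new server, after copying the files from the USB box:
CREATE DATABASE UserDb
ON (FILENAME = N'D:\SQLData\UserDb.mdf'),
   (FILENAME = N'E:\SQLLogs\UserDb_log.ldf')
FOR ATTACH;
[/code]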
I'm a C# developer, not a DBA, but fairly familiar with SQL Server 2000 and learning 2005. I need to update a table on one SQL server from another remote SQL 2005 server. Both will be SQL 2005, but they will not be on the same network, so there will be firewall issues to contend with. Security is not a huge concern, as the data is public info and not personal secure information or anything like that.

What I'll have is a table of data in one SQL database. I will have a copy of that data on another SQL server that is used at an ISP and drives a web site. I need the table at the ISP SQL server to get updated from the SQL server that is onsite in our office on our private network. I would like to avoid having to get network admins to open SQL Server ports if possible, so I'm wondering if there is an easy way with SQL 2005 to update data from one SQL server to another in the environment I have described, perhaps by exposing the data over a common port (i.e. port 80). I don't want to have to use integrated security, as I'm not sure this would work between remote databases in this scenario.

I don't know if the web service endpoint support in SQL 2005 would allow this. The examples I have seen online show a SQL endpoint created and then accessed from a .NET application. I can do that if I have to, but I would rather do all of this inside the database in some fashion. I am open to any idea that would work and, if I need to, I can ditch the idea of using something over port 80 and get the necessary ports opened between the SQL servers. This is a pretty simple scenario: one table of data needs to be updated weekly on a web-based SQL server from a remote SQL server.
Any ideas on how you would approach this would be greatly appreciated. Security for the data I'm accessing is not a huge concern as the data is public domain as I've said, but I need to make sure the rest of the server and database on both ends are secured.
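For the fallback case where the necessary ports do get opened between the two servers, the weekly push could be kept entirely inside the database with a linked server; a rough sketch, where the server, login, database, and table names are all placeholders:

[code]
-- On the office server: register the ISP server as a linked server (placeholder names)
EXEC sp_addlinkedserver @server = N'ISPSQL', @srvproduct = N'SQL Server';
EXEC sp_addlinkedsrvlogin
     @rmtsrvname = N'ISPSQL',
     @useself = N'FALSE',
     @rmtuser = N'WebSyncUser',
     @rmtpassword = N'placeholder-password';

-- Weekly push: replace the remote copy with the current local data
DELETE FROM ISPSQL.WebDb.dbo.PublicData;

INSERT INTO ISPSQL.WebDb.dbo.PublicData (ID, Name, Value)
SELECT ID, Name, Value
FROM dbo.PublicData;
[/code]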
Hi, I currently use SQL Server 2005. I need to move the data to my website; how do I do that? I have SQL Server Management Studio Express, but I couldn't get it to export the data as a .sql file (which is needed to import it into my website's SQL database). Please let me know of any tools I need to perform the task. Thanks.
Currently we update database A every day with tables from database B. Database B is in India and has a poor connection speed. What would be the best method of moving this data? Currently it's about 100 tables, and on average it's about 900 MB. Any input is greatly appreciated. Thanks.
Hello everybody. We need to move table T1 from database A to table T1 in database B on the same server.
Table T1 is 15 GB and has 40,000,000 rows.
Database B was just created and will act as a warehouse.
Could it be done simply by:
1. Creating table T1 in db B, and then
2. Setting the db to simple recovery
3. INSERT INTO B.dbo.T1 SELECT * FROM A.dbo.T1
4. Creating all the indexes on table T1 in db B
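As a point of reference, steps 2 and 3 of that outline would look roughly like this in T-SQL (the TABLOCK hint is an assumption for faster loading, not something from the post):

[code]
-- Step 2: minimize log growth during the bulk copy
ALTER DATABASE B SET RECOVERY SIMPLE;

-- Step 3: copy the rows, ideally before the indexes exist (step 4 creates them afterwards)
INSERT INTO B.dbo.T1 WITH (TABLOCK)
SELECT *
FROM A.dbo.T1;
[/code]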
We are currently moving our environment. I was told that we need to copy all the stuff over. More specifically, we want to move everything except the data, as our data is dynamic and will fill up in a few days by itself.
What is the best way to move everything over from one server instance to another?
My current approach is the following:
1. Create the file groups we have on our current server on the new server
2. Script out all databases with stored procedures, functions, views, privileges, indexes ...
3. Script out all the jobs
4. Script out all the DTS packages (or rather save each in a file)
5. Load all scripts into the new SQL server
6. Re-create user accounts (can these be scripted out also and then loaded?)
Am I missing something or is there a wiser alternative?
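On item 6: logins can be scripted as well; Microsoft's sp_help_revlogin script is the usual way to move SQL logins with their original SIDs and password hashes. A minimal manual sketch, assuming SQL Server 2005 or later and placeholder names and values:

[code]
-- On the old server: list logins and their SIDs (SQL Server 2005+)
SELECT name, sid, type_desc
FROM sys.server_principals
WHERE type IN ('S', 'U');

-- On the new server: recreate a Windows login
CREATE LOGIN [DOMAIN\AppAccount] FROM WINDOWS;

-- Recreate a SQL login preserving its SID so scripted database users stay mapped
CREATE LOGIN AppLogin
WITH PASSWORD = 'placeholder-password',
     SID = 0x912AB34CD56EF7889A0BC1DE23F45A67;
[/code]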
I have two databases DB_1 and DB_2 on my server. Both of them have 25 identical tables.
Tables under DB_2 get refreshed with new data daily, and once they're ready, that data should be copied to the DB_1 tables (deleting any existing data and loading them with the new data).
My question is: is there any other way, besides the Import Data Wizard, to move the data between the two databases, including deleting any existing data in the DB_1 tables and loading them with the new data?
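One alternative to the Import Data Wizard is a small T-SQL job (or stored procedure) that refreshes each of the 25 tables directly; a sketch for a single table, using a placeholder table name:

[code]
-- Refresh one table in DB_1 from its counterpart in DB_2 (placeholder table name)
TRUNCATE TABLE DB_1.dbo.Customers;   -- use DELETE instead if foreign keys reference the table

INSERT INTO DB_1.dbo.Customers
SELECT *
FROM DB_2.dbo.Customers;
[/code]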
I'd like to move data from my prod environment to my dev environment. The data in dev would be replaced by the prod data. I cannot do a detach/attach or a backup and restore because of already existing dev objects in the dev environment.
I have two databases located on the same server. I was able to script an identical table in the other database, but how do I transfer the table data there?
I'm using a legacy application built with VB5 and SQL Server 7. After recompiling it and putting the database on SQL Server 2012, I want to access the current login user using the SQL function SUSER_SNAME().
This is the code.
Set rdoRes = goDatabase.Connection.OpenResultset("select suser_sname()")
But I'm unable to get the current login user in the application. If I write any other SQL statement instead of this one, it runs; only this statement does not. Are there any security reasons for this?
I have a DataReader on a web page: <asp:SqlDataSource ID="ChangeInfoRecord" runat="server" ConnectionString.......... I want to display the value of one of the returned fields in a text box. What is the correct syntax? I tried RolledFromDisplay.Text = ChangeInfoRecord("RolledFrom"), but that isn't it.
Using a stored procedure, I'm trying to get data from Database1 into Database2 for the column named ManufacturerID.
The column names are the same in both databases, e.g. Database #1 (Empire1.dbo.Location.ManufacturerID) and Database #2 (Empire2.dbo.Products.ManufacturerID). Here's what I've tried, but it is throwing me an error...
SELECT ManufacturerID INTO Empire2
FROM Empire1.dbo.Products
INSERT INTO Location.ManufacturerID
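It is not entirely clear from the post which direction the copy should go, but assuming the intent is to fill Products.ManufacturerID in Empire2 from Location.ManufacturerID in Empire1, a cross-database INSERT ... SELECT would look roughly like this:

[code]
-- Copy ManufacturerID values from Empire1 into Empire2 (one reading of the intent)
INSERT INTO Empire2.dbo.Products (ManufacturerID)
SELECT ManufacturerID
FROM Empire1.dbo.Location;
[/code]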
Okay, after I got everything imported, I found that a few thousand columns had "Shifted" on me. So now I am trying to "Shift them over" to where they need to be.
I did this:
INSERT INTO dbo.TABLENAME (COLUMN_NAME_TO_BE_POPULATED)
SELECT COLUM_NAME_OF_INFO_TO_BE_MOVED
FROM dbo.TABLENAME
WHERE MFG = 'MANUFACTURER_NAME'
AND PN LIKE '8888888888'
GO
I populated the PN column with 8888888888 to use as a reference point, and the MFG column already was populated with the correct name, so I was using those as my unique reference points.
Other columns that have the same MFG name are correct, so I have to use two unique identifiers to specify the actual data that needs to be moved.
It inserted NULL into the whole table after I ran it in Query Analyzer. It appeared to disregard the WHERE clause altogether. Any ideas on what I am doing wrong here?
Is there a way to move the data from one column to the next one over by specifying other WHERE criteria?
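If the goal is to shift values between columns within the existing rows, an UPDATE rather than an INSERT is probably the statement to reach for (INSERT adds new rows, which would explain the NULLs); a sketch reusing the placeholder names from the post:

[code]
-- Move the value sideways within each matching row (placeholder names from the post)
UPDATE dbo.TABLENAME
SET COLUMN_NAME_TO_BE_POPULATED = COLUM_NAME_OF_INFO_TO_BE_MOVED
WHERE MFG = 'MANUFACTURER_NAME'
  AND PN LIKE '8888888888';
[/code]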
I'm attempting to move data from an Oracle table to an Access table using an Oracle stored procedure in DTS. The problem is that you can't pass parameters to an Oracle stored proc when it's called in a data pump task. Is there a way to pass global variables into an Oracle stored procedure which retrieves data and moves it to an Access database? Maybe in an ActiveX task? We are required to use an Oracle stored procedure by our DBAs, or else I'd just pass the variables into a SQL string and use it in the data pump task.
We need to move duplicate data from this sheet to another table. We also have the issue that sometimes verifiedmemberID is null and verifiermember name is null; there are also rows with values in BCP authorisationcode and FPoveridecode even though transactionmode/bcpmode is 'n', and rows where transactionmode/bcpmode is 'y' but bcpauthorisationcode is blank.