Errors From Command Files
Nov 10, 1998
Help!
I have a scheduled command exec that runs every minute. It returns zero if the execution is successful, or non-zero on error. I would like to raise an error that could be trapped as an alert so corrective action can be taken.
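One common approach, sketched below, is to capture the command's exit code in a T-SQL job step and call RAISERROR WITH LOG when it is non-zero, so a SQL Agent alert defined on that severity (or on the logged message) can fire and run a corrective job. This assumes xp_cmdshell is enabled; the command path is a placeholder, not from the original post.
-- Sketch: run the external command and raise a loggable error on a non-zero result.
-- 'D:\jobs\nightly_exec.cmd' is an illustrative placeholder.
DECLARE @rc INT;
EXEC @rc = master.dbo.xp_cmdshell 'D:\jobs\nightly_exec.cmd', no_output;
IF @rc <> 0
BEGIN
    -- WITH LOG writes the error to the SQL Server error log and Windows application log,
    -- which is what SQL Agent alerts watch.
    RAISERROR ('Scheduled command exec failed with result code %d.', 16, 1, @rc) WITH LOG;
END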
View 1 Replies
Feb 24, 2015
I have the need to delete old backup files via TSQL job. Found this solution online:
PushD "
emoteservershareDIFF" &&(
forfiles -m *DIFF*.sqb -d -1 -c "cmd /c del /q @path"
) & PopD
It works remotely if I run it via the command prompt. But when I add this to a T-SQL job on my remote SQL instance, it runs without deleting anything. What am I missing?
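One way to troubleshoot this from a T-SQL job step, sketched below, is to run the same forfiles command through xp_cmdshell so that any output (for example an "access denied" under the service account, a common cause when the job runs under an account without rights to the share) comes back as rows you can inspect. This assumes xp_cmdshell is enabled; the UNC path and retention window mirror the post.
-- Sketch: run forfiles via xp_cmdshell and look at what it reports.
EXEC master.dbo.xp_cmdshell
    'forfiles -p "\\remoteserver\share\DIFF" -m *DIFF*.sqb -d -1 -c "cmd /c del /q @path"';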
View 6 Replies
View Related
Dec 23, 2005
I have created an SSIS package. It runs successfully from the UI.
But when I try to execute it from the command prompt using the dtexec utility, it shows the following error messages.
Error: 2005-12-23 17:01:57.67
Code: 0xC00470FE
Source: Data Flow Task DTS.Pipeline
Description: The product level is insufficient for component "Flat File Source" (1).
End Error
Error: 2005-12-23 17:01:57.67
Code: 0xC00470FE
Source: Data Flow Task DTS.Pipeline
Description: The product level is insufficient for component "Script Component" (9).
End Error
I have entered the command as follows:
dtexec /f "c:\somefolder\package1.dtsx"
Any pointers regarding this issue would be helpful.
View 7 Replies
View Related
Jul 23, 2005
Hi All,
I want to shrink a transaction log using:
USE Master
BACKUP LOG test WITH truncate_only
USE test
DBCC shrinkfile(test_log, 0)
These commands will be executed from a workstation. After shrinking, I want to see the size of the transaction log and database files, but I do not know what command I should use. Could anyone help me?
Thanks in advance
John S.
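A short sketch of the commands commonly used to check log and data file sizes afterwards (the database name is taken from the post):
-- Log space used/allocated for every database, in MB and percent.
DBCC SQLPERF (LOGSPACE);

-- File-by-file size information for the current database.
USE test;
EXEC sp_helpfile;

-- Space reserved and used by data and indexes in the current database.
EXEC sp_spaceused;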
View 1 Replies
View Related
Dec 12, 2005
I am using installshield to distribute SQL Server 2005 Express. I have the SQLEXPR.EXE file and I want to run it in /qb mode so the user can see the pretty dialogs pop up but not have to click anything.
View 6 Replies
View Related
Oct 7, 2014
I have SSRS Log files under
C:\MSSQL\MSRS11.MSSQLSERVER\Reporting Services\LogFiles
I am frequently getting errors in Report Manager. When I open that log file folder, I can only see some old log files from 2013. So now, how do I enable that logging feature again?
i.e., was logging enabled on the report server before and then disabled?
View 3 Replies
View Related
Jul 20, 2005
I have several *.sql files with schema/data changes to be applied to our current database. Is there a way to create a T-SQL script that could be run from the SQL Query Analyzer that would sequentially call the *.sql files? i.e.
call schemaVersionCheck.sql
call addFieldToLoanTable.sql
call updateLoanTrigger.sql
call updateSchemaVersion
tia.
dynoweb
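One way to do this from Query Analyzer, sketched below, is to shell out to osql for each script via xp_cmdshell. This assumes xp_cmdshell is enabled, a trusted connection, and that the files sit in a path the SQL Server service account can read; the server, database, and folder names are placeholders, while the script names come from the post.
-- Sketch: run the change scripts in order from a single T-SQL batch.
-- -b makes osql return a non-zero result if a script fails, so the batch can stop.
DECLARE @rc INT;
EXEC @rc = master.dbo.xp_cmdshell 'osql -E -S MYSERVER -d LoanDb -b -i "C:\scripts\schemaVersionCheck.sql"';
IF @rc <> 0 RAISERROR ('schemaVersionCheck.sql failed.', 16, 1);
EXEC @rc = master.dbo.xp_cmdshell 'osql -E -S MYSERVER -d LoanDb -b -i "C:\scripts\addFieldToLoanTable.sql"';
IF @rc <> 0 RAISERROR ('addFieldToLoanTable.sql failed.', 16, 1);
-- ...repeat the same pattern for updateLoanTrigger.sql and updateSchemaVersion.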
View 2 Replies
View Related
Aug 8, 2006
Hi,
How do I execute all the .sql scripts in a folder with the OSQL command?
Thanks in advance,
Hari Haran Arulmozhi
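A rough sketch of one approach from T-SQL, assuming xp_cmdshell is enabled: list the .sql files with dir /b into a temp table, then run osql against each one. The folder, server, and database names are placeholders.
-- Sketch: enumerate the folder and run every .sql file through osql.
DECLARE @folder VARCHAR(260), @file VARCHAR(260), @cmd VARCHAR(1000);
SET @folder = 'C:\scripts\';

CREATE TABLE #files (fname VARCHAR(260));
INSERT INTO #files EXEC master.dbo.xp_cmdshell 'dir /b "C:\scripts\*.sql"';
-- Drop the trailing NULL row and any "File Not Found" message.
DELETE FROM #files WHERE fname IS NULL OR fname LIKE '%not found%';

DECLARE file_cursor CURSOR FOR SELECT fname FROM #files ORDER BY fname;
OPEN file_cursor;
FETCH NEXT FROM file_cursor INTO @file;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @cmd = 'osql -E -S MYSERVER -d MyDb -b -i "' + @folder + @file + '"';
    EXEC master.dbo.xp_cmdshell @cmd;
    FETCH NEXT FROM file_cursor INTO @file;
END
CLOSE file_cursor;
DEALLOCATE file_cursor;
DROP TABLE #files;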
View 4 Replies
View Related
Jul 26, 2006
I'm currently experiencing major problems with SSIS when opening and editing large .DTSX package files that contain Exec DTS 2000 Tasks which have the package data loaded internally. I have no issues if I point the task to a .DTS file, or to an actual DTS package on a SQL 2000 server, but if I load the package internally then once the underlying .DTSX file gets over around 17MB or so in size (which doesn't take long making a few edits to even fairly simple packages now and then), I start to experience major issues with VS/BIDS 2005 crashing randomly when I try to perform any action with the package (open, save etc). Things like OutOfMemory exception errors, followed by the properties of the Exec DTS 2000 task being deleted, and also sometimes accompanied by messages about the application not being installed properly.
Again, it's ONLY when the underlying .DTSX file reaches a certain size limit, and only when I've got an Exec DTS 2000 task with the package loaded internally. I've replicated the issue using several different package files on several different machines (even on servers with lots of memory, fwiw).
Can anyone out there help me with this? SSIS - namely SSIS Exec DTS 2000 package tasks - are our lifeblood at my company and this trend of random and serious crashing on large package files is very disturbing to say the least.
thanks,
Wil
View 1 Replies
View Related
Dec 10, 2013
I am using the Bulk Copy command for exporting data table-wise from a database to csv files, and it was working fine. For the last 3-4 days, when exporting, the data in the csv files for some tables is coming out as junk.
View 1 Replies
View Related
Jul 1, 2015
I recently updated the datatype of a sproc parameter from bit to tinyint. When I executed the sproc with the updated parameters the sproc appeared to succeed and returned "1 row(s) affected" in the console. However, the update triggered by the sproc did not actually work.
The table column was a bit which only allows 0 or 1 and the sproc was passing a value of 2 so the table was rejecting this value. However, the sproc did not return an error and appeared to return success. So is there a way to configure the database or sproc to return an error message when this type of error occurs?
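One option, sketched below, is to validate the parameter inside the procedure and raise an explicit error when it is out of range for the underlying bit column. The procedure, table, and column names here are illustrative, not from the post.
-- Sketch: reject out-of-range values explicitly instead of failing silently.
CREATE PROCEDURE dbo.usp_SetFlag
    @Id INT,
    @Flag TINYINT
AS
BEGIN
    SET NOCOUNT ON;

    -- The underlying column is bit, so only 0 and 1 are meaningful.
    IF @Flag NOT IN (0, 1)
    BEGIN
        RAISERROR ('@Flag must be 0 or 1; received %d.', 16, 1, @Flag);
        RETURN 1;
    END

    UPDATE dbo.MyTable
    SET FlagColumn = @Flag
    WHERE Id = @Id;

    -- Optionally surface "no rows updated" as an error as well.
    IF @@ROWCOUNT = 0
    BEGIN
        RAISERROR ('No row found for Id %d.', 16, 1, @Id);
        RETURN 1;
    END

    RETURN 0;
END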
View 1 Replies
View Related
Jul 31, 2006
I have a parent package that calls child packages inside a For Each container. When I debug/run the parent package (from VS), I get the following error message: Warning: The Execution method succeeded, but the number of errors raised (3) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
It appears to be failing while executing the child package. However, the logs (via the "progress" tab) for both the parent package and the child package show no errors other than the one listed above (and that shows in the parent package log). The child package appears to validate completely without error (all components are green and no error messages in the log). I turned on SSIS logging to a text file and see nothing in there either.
If I bump up the MaximumErrorCount in the parent package and in the Execute Package Task that calls the child package to 4 (to go one above the error count indicated in the message above), the whole thing executes successfully. I don't want to leave the Max Error Count set like this. Is there something I am missing? For example, are there errors that do not get logged by default? I get some warnings; does a certain number of warnings equal an error?
Thanks,
Lee
View 5 Replies
View Related
Apr 20, 2006
Starwin writes "when i execute DBCC CHECKDB, DBCC CHECKCATALOG
I reveived the following error.
how to solve it?
Server: Msg 8909, Level 16, State 1, Line 1 Table error: Object ID -2093955965, index ID 711, page ID (3:2530). The PageId in the page header = (34443:343146507).
. . . .
. . . .
CHECKDB found 0 allocation errors and 1 consistency errors in table '(Object ID -1635188736)' (object ID -1635188736).
CHECKDB found 0 allocation errors and 1 consistency errors in table '(Object ID -1600811521)' (object ID -1600811521).
. . . .
. . . .
Server: Msg 8909, Level 16, State 1, Line 1 Table error: Object ID -8748568, index ID 50307, page ID (3:2497). The PageId in the page header = (26707:762626875).
Server: Msg 8909, Level 16, State 1, Line 1 Table error: Object ID -7615284, index ID 35836, page ID (3:2534). The PageId in the page heade"
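For reference, a hedged sketch of how a repair run is usually attempted once restoring from a clean backup has been ruled out; REPAIR_ALLOW_DATA_LOSS can discard damaged pages, so it is very much a last resort. The database name is a placeholder.
-- Sketch: CHECKDB repair requires single-user mode; expect possible data loss.
ALTER DATABASE MyDamagedDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;

DBCC CHECKDB ('MyDamagedDb', REPAIR_ALLOW_DATA_LOSS);

ALTER DATABASE MyDamagedDb SET MULTI_USER;

-- Re-run a plain CHECKDB afterwards to confirm the errors are gone.
DBCC CHECKDB ('MyDamagedDb');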
View 1 Replies
View Related
Feb 23, 2007
I am using Visual Web Developer 2005 and SQL 2005 with VB as the code-behind.
I am using the INSERT command like this:
Dim test As New SqlDataSource()
test.ConnectionString = ConfigurationManager.ConnectionStrings("DatabaseConnectionString1").ToString()
test.InsertCommandType = SqlDataSourceCommandType.Text
test.InsertCommand = "INSERT INTO try (roll, name, age, email) VALUES (@roll, @name, @age, @email)"
test.InsertParameters.Add("roll", TextBox1.Text)
test.InsertParameters.Add("name", TextBox2.Text)
test.InsertParameters.Add("age", TextBox3.Text)
test.InsertParameters.Add("email", TextBox4.Text)
test.Insert()
I am using the UPDATE command like this:
Dim test As New SqlDataSource()
test.ConnectionString = ConfigurationManager.ConnectionStrings("DatabaseConnectionString").ToString()
test.UpdateCommandType = SqlDataSourceCommandType.Text
test.UpdateCommand = "UPDATE try SET name = '" + myname + "' , age = '" + myage + "' , email = '" + myemail + "' WHERE roll = 123"
test.Update()
But I have to use the SELECT command like this, which is completely different from the INSERT and UPDATE commands:
Dim tblData As New Data.DataTable()
Dim conn As New Data.SqlClient.SqlConnection("Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\Database.mdf;Integrated Security=True;User Instance=True")
Dim Command As New Data.SqlClient.SqlCommand("SELECT * FROM try WHERE age = '100' ", conn)
Dim da As New Data.SqlClient.SqlDataAdapter(Command)
da.Fill(tblData)
conn.Close()
TextBox4.Text = tblData.Rows(1).Item("name").ToString()
TextBox5.Text = tblData.Rows(1).Item("age").ToString()
TextBox6.Text = tblData.Rows(1).Item("email").ToString()
For the INSERT and UPDATE commands, defining the command, command type and connection string is the same, but for the SELECT command it is completely different. Why? Can I define the command, command type and connection string for the SELECT command similarly to the INSERT and UPDATE commands? If it's possible, how? Please help me.
View 2 Replies
View Related
Nov 4, 2006
Hi All,
I am using an OLE DB Source in my data flow component and want to select rows from the source based on the name I enter at execution time. I have created two variables:
enterName - String, package level (will store the name I enter)
myVar - String, package level (to store the query)
I am assigning this query to the myVar variable: "Select * from db.Users where (UsrName = " + @[User::enterName] + " )"
Now in the OLE DB source, I have selected "SQL Command from Variable", and I am getting the variable enterName. I select that, and when I click on OK I am getting this error.
Error at Data Flow Task [OLE DB Source [1]]: An OLE DB error has occurred. Error code: 0x80040E0C.
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E0C Description: "Command text was not set for the command object.".
Can someone guide me on where I am going wrong?
For the myVar variable, I have set the EvaluateAsExpression property to true too.
Please let me know where I am going wrong.
Thanks in advance.
View 12 Replies
View Related
Aug 30, 2004
Does somebody know how long (in characters) a script (command) submitted as a SQL command can be?
Thanks
View 1 Replies
View Related
Sep 19, 2006
Hi. I am writing a program in C# to migrate data from a Foxpro database to an SQL Server 2005 Express database. The package is being created programmatically. I am creating a separate data flow for each Foxpro table. It seems to be doing it ok but I am getting the following error message at the package validation stage:
Description: An OLE DB Error has occured. Error code: 0x80040E0C.
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E0C Description: "Command text was not set for the command object".
.........
Description: "component "OLE DB Destination" (22)" failed validation and returned validation status "VS_ISBROKEN".
This is the first time I am writing such code; there must be something I am not doing correctly, but I can't seem to figure it out. Any help will be highly appreciated. My code is as below:
private bool BuildPackage()
{
// Create the package object
oPackage = new Package();
// Create connections for the Foxpro and SQL Server data
Connections oPkgConns = oPackage.Connections;
// Foxpro Connection
ConnectionManager oFoxConn = oPkgConns.Add("OLEDB");
oFoxConn.ConnectionString = sSourceConnString; // Created elsewhere
oFoxConn.Name = "SourceConnectionOLEDB";
oFoxConn.Description = "OLEDB Connection For Foxpro Database";
// SQL Server Connection
ConnectionManager oSQLConn = oPkgConns.Add("OLEDB");
oSQLConn.ConnectionString = sTargetConnString; // Created elsewhere
oSQLConn.Name = "DestinationConnectionOLEDB";
oSQLConn.Description = "OLEDB Connection For SQL Server Database";
// Add Prepare SQL Task
Executable exSQLTask = oPackage.Executables.Add("STOCK:SQLTask");
TaskHost thSQLTask = exSQLTask as TaskHost;
thSQLTask.Properties["Connection"].SetValue(thSQLTask, "oSQLConn");
thSQLTask.Properties["DelayValidation"].SetValue(thSQLTask, true);
thSQLTask.Properties["ResultSetType"].SetValue(thSQLTask, ResultSetType.ResultSetType_None);
thSQLTask.Properties["SqlStatementSource"].SetValue(thSQLTask, @"C:LPFMigrateLPF_Script.sql");
thSQLTask.Properties["SqlStatementSourceType"].SetValue(thSQLTask, SqlStatementSourceType.FileConnection);
thSQLTask.FailPackageOnFailure = true;
// Add Data Flow Tasks. Create a separate task for each table.
// Get a list of tables from the source folder
arFiles = Directory.GetFileSystemEntries(sLPFDataFolder, "*.DBF");
for (iCount = 0; iCount <= arFiles.GetUpperBound(0); iCount++)
{
// Get the name of the file from the array
sDataFile = Path.GetFileName(arFiles[iCount].ToString());
sDataFile = sDataFile.Substring(0, sDataFile.Length - 4);
oDataFlow = ((TaskHost)oPackage.Executables.Add("DTS.Pipeline.1")).InnerObject as MainPipe;
oDataFlow.AutoGenerateIDForNewObjects = true;
// Create the source component
IDTSComponentMetaData90 oSource = oDataFlow.ComponentMetaDataCollection.New();
oSource.Name = (sDataFile + "Src");
oSource.ComponentClassID = "DTSAdapter.OLEDBSource.1";
// Get the design time instance of the component and initialize the component
CManagedComponentWrapper srcDesignTime = oSource.Instantiate();
srcDesignTime.ProvideComponentProperties();
// Add the connection manager
if (oSource.RuntimeConnectionCollection.Count > 0)
{
oSource.RuntimeConnectionCollection[0].ConnectionManagerID = oFoxConn.ID;
oSource.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.ToConnectionManager90(oFoxConn);
}
// Set Custom Properties
srcDesignTime.SetComponentProperty("AccessMode", 0);
srcDesignTime.SetComponentProperty("AlwaysUseDefaultCodePage", true);
srcDesignTime.SetComponentProperty("OpenRowset", sDataFile);
// Re-initialize metadata
srcDesignTime.AcquireConnections(null);
srcDesignTime.ReinitializeMetaData();
srcDesignTime.ReleaseConnections();
// Create Destination component
IDTSComponentMetaData90 oDestination = oDataFlow.ComponentMetaDataCollection.New();
oDestination.Name = (sDataFile + "Dest");
oDestination.ComponentClassID = "DTSAdapter.OLEDBDestination.1";
// Get the design time instance of the component and initialize the component
CManagedComponentWrapper destDesignTime = oDestination.Instantiate();
destDesignTime.ProvideComponentProperties();
// Add the connection manager
if (oDestination.RuntimeConnectionCollection.Count > 0)
{
oDestination.RuntimeConnectionCollection[0].ConnectionManagerID = oSQLConn.ID;
oDestination.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.ToConnectionManager90(oSQLConn);
}
// Set custom properties
destDesignTime.SetComponentProperty("AccessMode", 2);
destDesignTime.SetComponentProperty("AlwaysUseDefaultCodePage", false);
destDesignTime.SetComponentProperty("OpenRowset", "[dbo].[" + sDataFile + "]");
// Create the path to link the source and destination components of the dataflow
IDTSPath90 dfPath = oDataFlow.PathCollection.New();
dfPath.AttachPathAndPropagateNotifications(oSource.OutputCollection[0], oDestination.InputCollection[0]);
// Iterate through the inputs of the component.
foreach (IDTSInput90 input in oDestination.InputCollection)
{
// Get the virtual input column collection
IDTSVirtualInput90 vInput = input.GetVirtualInput();
// Iterate through the column collection
foreach (IDTSVirtualInputColumn90 vColumn in vInput.VirtualInputColumnCollection)
{
// Call the SetUsageType method of the design time instance of the component.
destDesignTime.SetUsageType(input.ID, vInput, vColumn.LineageID, DTSUsageType.UT_READWRITE);
}
//Map external metadata to the inputcolumn
foreach (IDTSInputColumn90 inputColumn in input.InputColumnCollection)
{
IDTSExternalMetadataColumn90 externalColumn = input.ExternalMetadataColumnCollection.New();
externalColumn.Name = inputColumn.Name;
externalColumn.Precision = inputColumn.Precision;
externalColumn.Length = inputColumn.Length;
externalColumn.DataType = inputColumn.DataType;
externalColumn.Scale = inputColumn.Scale;
// Map the external column to the input column.
inputColumn.ExternalMetadataColumnID = externalColumn.ID;
}
}
}
// Add precedence constraints to the package executables
PrecedenceConstraint pcTasks = oPackage.PrecedenceConstraints.Add((Executable)thSQLTask, oPackage.Executables[0]);
pcTasks.Value = DTSExecResult.Success;
for (iCount = 1; iCount <= (oPackage.Executables.Count - 1); iCount++)
{
pcTasks = oPackage.PrecedenceConstraints.Add(oPackage.Executables[iCount - 1], oPackage.Executables[iCount]);
pcTasks.Value = DTSExecResult.Success;
}
// Validate the package
DTSExecResult eResult = oPackage.Validate(oPkgConns, null, null, null);
// Check if the package was successfully executed
if (eResult.Equals(DTSExecResult.Canceled) || eResult.Equals(DTSExecResult.Failure))
{
string sErrorMessage = "";
foreach (DtsError pkgError in oPackage.Errors)
{
sErrorMessage = sErrorMessage + "Description: " + pkgError.Description + "";
sErrorMessage = sErrorMessage + "HelpContext: " + pkgError.HelpContext + "";
sErrorMessage = sErrorMessage + "HelpFile: " + pkgError.HelpFile + "";
sErrorMessage = sErrorMessage + "IDOfInterfaceWithError: " + pkgError.IDOfInterfaceWithError + "";
sErrorMessage = sErrorMessage + "Source: " + pkgError.Source + "";
sErrorMessage = sErrorMessage + "Subcomponent: " + pkgError.SubComponent + "";
sErrorMessage = sErrorMessage + "Timestamp: " + pkgError.TimeStamp + "";
sErrorMessage = sErrorMessage + "ErrorCode: " + pkgError.ErrorCode;
}
MessageBox.Show("The DTS package was not built successfully because of the following error(s):" + sErrorMessage, "Package Builder", MessageBoxButtons.OK, MessageBoxIcon.Information);
return false;
}
// return a successful result
return true;
}
View 2 Replies
View Related
Jan 9, 2015
I proposed on a new server that we separate data files, log files, tempdb, backups, etc. onto separate LUNs on a SAN with high-speed solid state drives. I was told that with the new solid state SAN technology this would decrease performance, and that it did not work the same way as it did when you had RAID 5s, etc. I thought that if things were carried out correctly by a SAN administrator, they would know how to configure it for optimal performance.
View 2 Replies
View Related
Mar 13, 2008
In the For Loop, how do I iterate from older flat files to newer flat files based on the file's timestamp? If there are some older files in that folder, they should be processed first, and then continue with the newer ones.
Any Suggestions?
View 3 Replies
View Related
Apr 24, 2008
In the first step of my SSIS package I need to get files from FTP and dump it/them in a local directory, but it's more than that, the logic is like this:
1. If no file(s) found, stop executing and send email saying no file(s) found;
2. If file(s) found, then compare it/them with existing files in our archive folder; if file(s) already exist in archive folder, stop executing and send email saying file(s) already existed, if file(s) not in archive folder yet, then transfer it/them to the local directory for processing.
I know I have to use a script task to do this, and I did some research and found examples for each of the above 2 steps but not both combined, so that's why I need some help here to get the logic incorporated right.
Thanks for the help in advance and i apologize for the long lines of code!
example for step 1:
----------------------------------------------------------------------------------------------------------
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper
Imports Microsoft.VisualBasic.FileIO.FileSystem
Imports System.IO.FileSystemInfo
Public Class ScriptMain
' The execution engine calls this method when the task executes.
' To access the object model, use the Dts object. Connections, variables, events,
' and logging features are available as static members of the Dts class.
' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
'
' To open Code and Text Editor Help, press F1.
' To open Object Browser, press Ctrl+Alt+J.
Public Sub Main()
Dim cDataFileName As String
Dim cFileType As String
Dim cFileFlgVar As String
WriteVariable("SCFileFlg", False)
WriteVariable("OOFileFlg", False)
WriteVariable("INFileFlg", False)
WriteVariable("IAFileFlg", False)
WriteVariable("RCFileFlg", False)
cDataFileName = ReadVariable("DataFileName").ToString
cFileType = Left(Right(cDataFileName, 4), 2)
cFileFlgVar = cFileType.ToUpper + "FileFlg"
WriteVariable(cFileFlgVar, True)
Dts.TaskResult = Dts.Results.Success
End Sub
Private Sub WriteVariable(ByVal varName As String, ByVal varValue As Object)
Try
Dim vars As Variables
Dts.VariableDispenser.LockForWrite(varName)
Dts.VariableDispenser.GetVariables(vars)
Try
vars(varName).Value = varValue
Catch ex As Exception
Throw ex
Finally
vars.Unlock()
End Try
Catch ex As Exception
Throw ex
End Try
End Sub
Private Function ReadVariable(ByVal varName As String) As Object
Dim result As Object
Try
Dim vars As Variables
Dts.VariableDispenser.LockForRead(varName)
Dts.VariableDispenser.GetVariables(vars)
Try
result = vars(varName).Value
Catch ex As Exception
Throw ex
Finally
vars.Unlock()
End Try
Catch ex As Exception
Throw ex
End Try
Return result
End Function
End Class
example for step 2:
-------------------------------------------------------------------------------------------------------
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
' The execution engine calls this method when the task executes.
' To access the object model, use the Dts object. Connections, variables, events,
' and logging features are available as static members of the Dts class.
' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
'
' To open Code and Text Editor Help, press F1.
' To open Object Browser, press Ctrl+Alt+J.
Public Sub Main()
Try
'Create the connection to the ftp server
Dim cm As ConnectionManager = Dts.Connections.Add("FTP")
'Set the properties like username & password
cm.Properties("ServerName").SetValue(cm, "ftp.name.com")
cm.Properties("ServerUserName").SetValue(cm, "username")
cm.Properties("ServerPassword").SetValue(cm, "password")
cm.Properties("ServerPort").SetValue(cm, "21")
cm.Properties("Timeout").SetValue(cm, "0") 'The 0 setting will make it not timeout
cm.Properties("ChunkSize").SetValue(cm, "1000") '1000 kb
cm.Properties("Retries").SetValue(cm, "1")
'create the FTP object that sends the files and pass it the connection created above.
Dim ftp As FtpClientConnection = New FtpClientConnection(cm.AcquireConnection(Nothing))
'Connects to the ftp server
ftp.Connect()
'ftp.SetWorkingDirectory("..")
ftp.SetWorkingDirectory("directoryname")
Dim folderNames() As String
Dim fileNames() As String
ftp.GetListing(folderNames, fileNames)
Dim maxname As String = ""
For Each filename As String In fileNames
' whatever operation you need to do to find the correct file...
Next
Dim files(0) As String
files(0) = maxname
ftp.ReceiveFiles(files, "C: emp", True, True)
' Close the ftp connection
ftp.Close()
'Set the filename you retreive for use in data flow
Dts.Variables.Item("FILENAME").Value = maxname
Catch ex As Exception
Dts.TaskResult = Dts.Results.Failure
End Try
Dts.TaskResult = Dts.Results.Success
End Sub
End Class
View 16 Replies
View Related
Oct 8, 2007
Brief overview... Running Windows Server 2003 Enterprise 64 bit - all service packs and patches current.
SQL Server 2005 Enterprise Edition 64 bit, build: Microsoft SQL Server 2005 - 9.00.3054.00 (X64) Mar 23 2007 18:41:50 Copyright (c) 1988-2005 Microsoft Corporation Enterprise Edition (64-bit) on Windows NT 5.2 (Build 3790: Service Pack 2)
I cannot import any SSIS packages nor create any new folders under stored packages. I have googled the newsgroups and looked at BOL to no avail. HELP!!!!
View 20 Replies
View Related
Jul 23, 2005
I am thinking about replacing the INSERT data script files that I have with XML files. This way I can open the XML file using an XML editor, see the values in a grid, and make changes more easily. Do you see any problem with this approach?
I managed to put together some code that exports a SQL table with its data to an XML file, and also code that reads the XML file's data and inserts it into a table. Now I am researching XSD, td:datatype, DTD... (I am new to XML) in order to figure out how I can use a single XML file that will hold both the SQL Server fields, the datatypes and their values.
If you have links to some sample code that has anything to do with the datatype export and import I am working on, can you please share them with me? Most importantly, what do you think about the idea of using XML files vs SQL scripts?
Thank you
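If it helps, a minimal sketch of the built-in T-SQL side of this using SQL Server 2000 era syntax; the table, column, and element names are placeholders, not from the post.
-- Export: return the table's rows as XML (redirect the output to a file from client code or a tool).
SELECT *
FROM dbo.MyTable
FOR XML AUTO, ELEMENTS;

-- Import: shred an XML document back into rows with OPENXML.
DECLARE @doc NVARCHAR(4000), @hdoc INT;
SET @doc = N'<root>
  <MyTable><ID>1</ID><Name>First</Name></MyTable>
  <MyTable><ID>2</ID><Name>Second</Name></MyTable>
</root>';

EXEC sp_xml_preparedocument @hdoc OUTPUT, @doc;

INSERT INTO dbo.MyTable (ID, Name)
SELECT ID, Name
FROM OPENXML (@hdoc, '/root/MyTable', 2)   -- flag 2 = element-centric mapping
WITH (ID INT 'ID', Name NVARCHAR(50) 'Name');

EXEC sp_xml_removedocument @hdoc;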
View 4 Replies
View Related
Jun 25, 2015
I have a scenario where I need to convert RDF files to PDF files. May I know whether this is achievable in SSIS by writing C# code?
View 6 Replies
View Related
Oct 1, 2004
I want to reduce the number of Virtual Log Files I have. In reading through the Books Online documentation, I realized that I forgot to move the transaction log files to a different drive. Now that the server is in production, I wanted to get some input about the best way of making this change.
Can I just change the directory the log files are being written to in the DB properties without having any adverse problems occurring?
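Changing the path in the database properties alone does not relocate files that already exist. A rough sketch of the detach/move/attach approach commonly used on SQL Server 2000 follows; the database name and file paths are placeholders, and this needs a brief outage window.
-- Sketch: relocate the transaction log by detaching, moving the .ldf, and re-attaching.
USE master;

-- 1. Detach the database (no active connections allowed).
EXEC sp_detach_db 'MyDb';

-- 2. Move MyDb_log.ldf to the new drive with the operating system (e.g. to E:\SQLLogs\).

-- 3. Re-attach, pointing at the new log location.
EXEC sp_attach_db @dbname = 'MyDb',
    @filename1 = 'D:\SQLData\MyDb.mdf',
    @filename2 = 'E:\SQLLogs\MyDb_log.ldf';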
View 2 Replies
View Related
Sep 17, 2004
Hi!
I'm using replication with two databases on SQL 2000. At the beginning, the log file size was 50 MB and the data file size was 150 MB. But now the log file size is 2 GB and the data file size is 4 GB. I would like to decrease the log files and the data files. How do I do this?
(Using truncate and shrink doesn't change anything.)
Thanks!!!
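A short sketch of the usual SQL 2000 sequence; the logical file names and target sizes are placeholders. Note that with replication the log cannot be truncated past transactions the Log Reader Agent has not yet processed, which is often why it keeps growing, and a data file only shrinks if it actually contains free space.
-- Back up (or truncate) the log so the inactive portion can be released, then shrink.
BACKUP LOG MyDb WITH TRUNCATE_ONLY;

USE MyDb;
DBCC SHRINKFILE (MyDb_log, 100);   -- target size in MB
DBCC SHRINKFILE (MyDb_data, 1000); -- target size in MB

-- Check the result.
EXEC sp_helpfile;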
View 2 Replies
View Related
Dec 11, 2005
I need to write code to access many image files and wave files and display them on the webpage. Both of these file types are stored on the server (as many experts in this forum advised) and their locations are in the SQL Server database. As for the wav files, I think I will have to convert them to MP3 first for fast delivery over the internet. Since I am a newbie, any help would be appreciated.
Thanks,
View 14 Replies
View Related
Jul 20, 2005
I have a problem with bcp and format files.
We changed our databases from varchar to nvarchar to support unicode. No problems so far with that; it is working fine.
But now I need a format file for the customer table and it is not working. It works fine with the old DB with varchar, but with nvarchar I'm not able to copy the data. The biggest problem is that I get no error message. BCP starts copying to the table and finishes without an error message.
This is my table:
CREATE TABLE [dbo].[Customer] (
[ID] [int] NOT NULL ,
[CreationTime] [datetime] NULL ,
[ModificationTime] [datetime] NULL ,
[DiscoveryTime] [datetime] NULL ,
[Name_] [nvarchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AI NULL ,
[Class] [int] NULL ,
[Subclass] [int] NULL ,
[Capabilities] [int] NULL ,
[SnapshotID] [int] NOT NULL ,
[CompanyName] [nvarchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AI NOT NULL ,
[TargetRCCountry] [nvarchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AI NOT NULL ,
[LocationID] [int] NULL ,
[MirrorID] [binary] (16) NULL ,
[DeleteFlag] [bit] NULL ,
[AdminStatus] [bit] NULL
) ON [PRIMARY]
GO
and this is the format file:
8.0
13
1 SQLINT 1 12 "#~@~#" 1 ID ""
2 SQLDATETIME 1 24 "#~@~#" 2 CreationTime ""
3 SQLDATETIME 1 24 "#~@~#" 3 ModificationTime ""
4 SQLDATETIME 1 24 "#~@~#" 4 DiscoveryTime ""
5 SQLNCHAR 2 510 "#~@~#" 5 Name_ SQL_Latin1_General_CP1_CI_AS
6 SQLINT 1 12 "#~@~#" 6 Class ""
7 SQLINT 1 12 "#~@~#" 7 Subclass ""
8 SQLINT 1 12 "#~@~#" 8 Capabilities ""
9 SQLINT 1 12 "#~@~#" 9 SnapshotID ""
10 SQLNCHAR 2 510 "#~@~#" 10 CompanyName SQL_Latin1_General_CP1_CI_AS
11 SQLNCHAR 2 510 "#~@~#" 11 TargetRCCountry SQL_Latin1_General_CP1_CI_AS
12 SQLINT 1 12 "#~@~#" 12 LocationID ""
13 SQLBINARY 1 33 "#~@~#\r\n" 13 MirrorID ""
"#~@~#" is the field terminator. We have a lot of text files with all kinds of characters in them, so we think this is a sequence that will never occur in our files.
Thanks for your help!
View 2 Replies
View Related
Jun 14, 2007
Hi All,
Are there any MDS descriptor files associated with database files? Are these associated with transaction logs? As per my information, there are .mdf, .ndf and .ldf files. Could anyone please guide me ASAP, as there is an urgent requirement from one of the clients?
Thanks in Advance
Gaurav Matkar
View 5 Replies
View Related
Jan 21, 2006
Is there a way to do this with some sort of task ?
View 7 Replies
View Related
Nov 8, 2006
I'm using a script task to rename files inside a Foreach Loop over the files; the code looks like this:
Dim file As New System.IO.FileInfo(CStr(Dts.Variables("User::FileName").Value))
dim newname as string = Split(file.Name, "_")(0) & ".jpg"
Dts.Variables("User::outputname").Value = file.DirectoryName & "" & newname
Then I use an FTP task to send the files, but it prompts "the variable User::outputname doesn't contain file path(s)".
I tried to use a File System task to perform the rename, but that failed as well.
The rename operation in the File System task can only rename a file in a specified location. Who can help me?
TIA
View 1 Replies
View Related
Sep 3, 2007
Hi gurus,
I've created a linked server (and set up the corresponding schema.ini file) in order to perform bulk inserts from some CSV text files into SQL tables (from my standpoint the text files are just for reading purposes). The linked server works fine (I can select the data in the files without a problem).
Now the question: is it possible to automatically detect when one or more of those files change, in order to start the import process automatically? Something like having a trigger created on the CSV files? Or is there no easy way to do that, so I have to, say, create a job that periodically checks whether the files have changed programmatically (say, recording each file's timestamp every time it is imported and comparing the recorded value with the current one, or whatever)?
Thanks a lot in advance!
View 1 Replies
View Related
Jan 22, 2008
Hello,
I might have a rather strange question, but I was wondering if there is a way/tool that can be used to convert/import Word documents (.doc) into Reporting Services report files (.rdl). The problem is that I have lots of Word documents containing e.g. survey questions. The reports are very extensive, with many pages and many "graphical" objects, e.g. rectangles etc. Now I want to have all these reports imported into Reporting Services, i.e. I want the .doc files to be converted to .rdl files. I do not want to use e.g. OfficeWriter; I want the .doc files to be migrated to the RDL format. Is that possible? Is there a tool for this? Both file formats are Microsoft formats, so there should be a conversion tool, right?
Best Regards
Lars
View 13 Replies
View Related