I might have a rather strange question, but I was wondering whether there is a way or tool to convert/import Word documents (.doc) into Reporting Services report files (.rdl). The problem is that I have lots of Word documents containing, e.g., survey questions. The reports are very extensive, with many pages and many "graphical" objects, e.g. rectangles. Now I want all these reports imported into Reporting Services, i.e. I want the .doc files converted to .rdl files. I do not want to use something like OfficeWriter; I want the .doc files migrated to the .rdl format. Is that possible? Is there a tool for this? Both file formats are Microsoft formats, so there should be a conversion tool, right?
How can I convert SQL trace files into Excel files without doing any work in SQL Profiler or on the SQL Server itself, using scripting code? Assume we only have the SQL trace files. What steps are involved in converting them to .CSV format with a single "CLICK"?
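One hedged approach: SQL Server can read a trace file directly with fn_trace_gettable, so a small batch file can stage the rows and export them, which gives you the "single click". A minimal sketch, where the paths, server name, and staging table are hypothetical:

-- stage the trace contents into a table
SELECT TextData, ApplicationName, LoginName, StartTime, Duration
INTO dbo.TraceStaging
FROM sys.fn_trace_gettable('C:\Traces\MyTrace.trc', DEFAULT);

Then a one-line .cmd file exports it to a CSV that Excel opens directly:

bcp "SELECT * FROM MyDB.dbo.TraceStaging" queryout "C:\Traces\MyTrace.csv" -c -t, -T -S MyServer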
My task is to convert JPEGs to binary and then insert them into a table called "images". I need to convert and insert all JPEG files in a directory. I'm able to accomplish the task if the files are numbered: the query below works by retrieving one file at a time based on the value of @i. However, I also have directories where the files are not numbered but have ordinary text names like "Red_Sofa.jpg". I need to iterate through these directories as well and convert/insert the JPEGs. I'm running SSMS 2014 Express on 4.0 and Windows 7.
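One common pattern for the un-numbered case is sketched below; it assumes xp_cmdshell is enabled, that C:\Images is a hypothetical folder, and that dbo.images has hypothetical columns (file_name, img_data). The directory listing goes into a temp table, and a cursor builds one OPENROWSET ... SINGLE_BLOB insert per file:

CREATE TABLE #files (fname NVARCHAR(260));
INSERT INTO #files EXEC master..xp_cmdshell 'dir /b C:\Images\*.jpg';
DELETE FROM #files WHERE fname IS NULL OR fname LIKE '%File Not Found%';

DECLARE @fname NVARCHAR(260), @sql NVARCHAR(MAX);
DECLARE c CURSOR FOR SELECT fname FROM #files;
OPEN c;
FETCH NEXT FROM c INTO @fname;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Build one insert per file; BulkColumn is the file's binary content.
    SET @sql = N'INSERT INTO dbo.images (file_name, img_data) '
             + N'SELECT N''' + @fname + N''', BulkColumn '
             + N'FROM OPENROWSET(BULK ''C:\Images\' + @fname + N''', SINGLE_BLOB) AS b;';
    EXEC (@sql);
    FETCH NEXT FROM c INTO @fname;
END
CLOSE c;
DEALLOCATE c;
DROP TABLE #files;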
1) What is the current version of SQL Server Express?
2) How much does SQL Server Express cost (figure about 500 branch servers)?
3) Are there any tools from Microsoft to convert Access 97 directly to SQL Server Express, and how much do they cost?
4) Server hardware requirements to run SQL Server Express: disk size, memory size, security settings, prerequisites, services needed.
5) Maximum capacity/capabilities of SQL Server Express: max number of simultaneous users, tables, rows, database size.
6) Any installation instructions for SQL Server Express?
I proposed that on a new server we separate data files, log files, tempdb, backups, etc. onto separate LUNs on a SAN with high-speed solid-state drives. I was told that with the new solid-state SAN technology this would decrease performance, and that it does not work the same way as it did with RAID 5 and similar configurations. I thought that if things were carried out correctly, a SAN administrator would know how to configure for optimal performance.
In a For Loop, how do I iterate from older flat files to newer flat files based on each file's timestamp? If there are older files in the folder, they should be processed first, before continuing with the newer ones.
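The Foreach File enumerator does not sort by timestamp, so one hedged workaround is a Script Task that builds an oldest-first list for a Foreach From Variable enumerator to walk. A minimal VB sketch, where SourceFolder and FileList are hypothetical package variables (they must be listed in the task's ReadOnlyVariables/ReadWriteVariables):

Imports System
Imports System.IO
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain
    Public Sub Main()
        ' Read the source folder from a package variable (hypothetical name).
        Dim folder As String = Dts.Variables("SourceFolder").Value.ToString()

        ' Collect the flat files and their last-write times.
        Dim files As String() = Directory.GetFiles(folder, "*.txt")
        Dim times(files.Length - 1) As DateTime
        For i As Integer = 0 To files.Length - 1
            times(i) = File.GetLastWriteTime(files(i))
        Next

        ' Sort the file names by timestamp, oldest first.
        Array.Sort(times, files)

        ' Hand the ordered list to a Foreach From Variable enumerator.
        Dts.Variables("FileList").Value = files
        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class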
In the first step of my SSIS package I need to get files from FTP and dump them in a local directory, but there's more to it; the logic is: 1. If no files are found, stop executing and send an email saying no files were found. 2. If files are found, compare them with the existing files in our archive folder: if a file already exists in the archive folder, stop executing and send an email saying the file already exists; if it is not in the archive folder yet, transfer it to the local directory for processing.
I know I have to use a Script Task to do this. I did some research and found examples for each of the two steps above, but not both combined, so I need some help getting the logic incorporated correctly. Thanks in advance for the help, and I apologize for the long lines of code!
example for step 1: ----------------------------------------------------------------------------------------------------------
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.
Imports System
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain
    ' The execution engine calls this method when the task executes. To access the
    ' object model, use the Dts object. Connections, variables, events, and logging
    ' features are available as static members of the Dts class. Before returning
    ' from this method, set the value of Dts.TaskResult to indicate success or failure.
    '
    ' To open Code and Text Editor Help, press F1.
    ' To open Object Browser, press Ctrl+Alt+J.
    Public Sub Main()
        Dim cDataFileName As String
        Dim cFileType As String
        Dim cFileFlgVar As String

        ' Reset all file-type flags, then set the one matching the current file.
        WriteVariable("SCFileFlg", False)
        WriteVariable("OOFileFlg", False)
        WriteVariable("INFileFlg", False)
        WriteVariable("IAFileFlg", False)
        WriteVariable("RCFileFlg", False)

        ' The two characters in front of the file extension identify the file type.
        cDataFileName = ReadVariable("DataFileName").ToString()
        cFileType = Left(Right(cDataFileName, 4), 2)
        cFileFlgVar = cFileType.ToUpper() + "FileFlg"
        WriteVariable(cFileFlgVar, True)

        Dts.TaskResult = Dts.Results.Success
    End Sub

    ' Locks a package variable for writing and assigns it a value.
    Private Sub WriteVariable(ByVal varName As String, ByVal varValue As Object)
        Try
            Dim vars As Variables = Nothing
            Dts.VariableDispenser.LockForWrite(varName)
            Dts.VariableDispenser.GetVariables(vars)
            Try
                vars(varName).Value = varValue
            Catch ex As Exception
                Throw ex
            Finally
                vars.Unlock()
            End Try
        Catch ex As Exception
            Throw ex
        End Try
    End Sub

    ' Locks a package variable for reading and returns its value.
    Private Function ReadVariable(ByVal varName As String) As Object
        Dim result As Object = Nothing
        Try
            Dim vars As Variables = Nothing
            Dts.VariableDispenser.LockForRead(varName)
            Dts.VariableDispenser.GetVariables(vars)
            Try
                result = vars(varName).Value
            Catch ex As Exception
                Throw ex
            Finally
                vars.Unlock()
            End Try
        Catch ex As Exception
            Throw ex
        End Try
        Return result
    End Function
End Class
example for step 2: -------------------------------------------------------------------------------------------------------
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
' The execution engine calls this method when the task executes.
' To access the object model, use the Dts object. Connections, variables, events,
' and logging features are available as static members of the Dts class.
' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
'
' To open Code and Text Editor Help, press F1.
' To open Object Browser, press Ctrl+Alt+J.
Public Sub Main()
Try
'Create the connection to the ftp server
Dim cm As ConnectionManager = Dts.Connections.Add("FTP")
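        ' (The original post was cut off here. Below is a hedged completion that
        ' combines both steps: list the FTP files, compare each one against the
        ' archive folder, and only pull down files that are not archived yet.
        ' The server, the folder paths, and the FilesFound / FilesArchived
        ' variables are all hypothetical; the variables must be listed in the
        ' task's ReadWriteVariables.)
        cm.Properties("ServerName").SetValue(cm, "ftp.example.com")
        cm.Properties("ServerUserName").SetValue(cm, "ftpuser")
        cm.Properties("ServerPassword").SetValue(cm, "ftppassword")

        Dim ftp As FtpClientConnection = New FtpClientConnection(cm.AcquireConnection(Nothing))
        ftp.Connect()

        Dim folderNames() As String = Nothing
        Dim fileNames() As String = Nothing
        ftp.GetListing(folderNames, fileNames)

        If fileNames Is Nothing OrElse fileNames.Length = 0 Then
            ' Step 1: nothing to fetch; flag it so a downstream Send Mail task can report it.
            Dts.Variables("FilesFound").Value = False
        Else
            Dts.Variables("FilesFound").Value = True
            For Each f As String In fileNames
                If System.IO.File.Exists(System.IO.Path.Combine("C:\Archive", f)) Then
                    ' Step 2: already archived; flag it so the package can mail and stop.
                    Dts.Variables("FilesArchived").Value = True
                Else
                    ' New file: pull it into the local folder for processing.
                    ftp.ReceiveFiles(New String() {f}, "C:\LocalDir", True, False)
                End If
            Next
        End If

        ftp.Close()
        Dts.TaskResult = Dts.Results.Success
    Catch ex As Exception
        Dts.Events.FireError(0, "FTP script", ex.Message, String.Empty, 0)
        Dts.TaskResult = Dts.Results.Failure
    End Try
End Sub
End Class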
It works remotely if I run it via the command prompt, but when I add this to a T-SQL job on my remote SQL instance, it runs without deleting anything. What am I missing?
Brief overview: running Windows Server 2003 Enterprise 64-bit with all service packs and patches current, and SQL Server 2005 Enterprise Edition 64-bit, build: Microsoft SQL Server 2005 - 9.00.3054.00 (X64) Mar 23 2007 18:41:50 Copyright (c) 1988-2005 Microsoft Corporation Enterprise Edition (64-bit) on Windows NT 5.2 (Build 3790: Service Pack 2). I cannot import any SSIS packages nor create any new folders under Stored Packages. I have googled the newsgroups and looked at BOL to no avail. Help!
I am thinking about replacing the INSERT data script files that I have with XML files. That way I can open the XML file in an XML editor, see the values in a grid, and make changes more easily. Do you see any problem with this approach? I managed to put together some code that exports a SQL table with its data to an XML file, and also code that reads the XML file's data and inserts it into a table. Now I am researching XSD, XSD datatypes, DTD, etc. (I am new to XML) in order to figure out how I can use a single XML file that holds the SQL Server fields, their datatypes, and their values. If you have links to sample code that has anything to do with the datatype export and import I am working on, can you please share them with me? Most importantly, what do you think about the idea of using XML files vs. SQL scripts? Thank you.
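For the "single file with fields, datatypes, and values" part, SQL Server itself can emit an inline XSD schema next to the data. A minimal sketch, where dbo.Customer and its two columns are hypothetical:

-- Export: one XML document containing an inline XSD (names + datatypes) plus the rows.
SELECT ID, Name
FROM dbo.Customer
FOR XML AUTO, ELEMENTS, XMLSCHEMA('urn:customer-data');

-- Import: shred XML back into rows with nodes()/value().
DECLARE @x XML = N'<Customer xmlns="urn:customer-data"><ID>1</ID><Name>Acme</Name></Customer>';
WITH XMLNAMESPACES (DEFAULT 'urn:customer-data')
INSERT INTO dbo.Customer (ID, Name)
SELECT c.value('(ID)[1]', 'int'), c.value('(Name)[1]', 'nvarchar(50)')
FROM @x.nodes('/Customer') AS t(c);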
I have been struggling to find a solution to convert XLSX files with multiple sheets to CSV files.
Requirements:
>> Convert an XLSX file with multiple sheets to CSV files
>> CSV file names: XLSX filename + '_' + sheet name
>> The script has to be in VB as I am using SSIS 2005
>> I started developing a script using Microsoft.Office.Interop.Excel.dll; this DLL is referenced in the Script Task
>> Found a web link that looked useful... [URL] ....
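A minimal sketch of the interop approach, assuming Excel 2007 or later is installed on the SSIS box and that C:\Data\Book1.xlsx is a hypothetical path (Office automation on a server is officially unsupported, so treat this as a starting point rather than a finished solution):

Imports System
Imports System.IO
Imports Microsoft.Office.Interop.Excel

Public Class SheetExporter
    Public Shared Sub ExportSheets(ByVal xlsxPath As String)
        Dim app As New Application()
        Dim wb As Workbook = app.Workbooks.Open(xlsxPath)
        Dim baseName As String = Path.GetFileNameWithoutExtension(xlsxPath)
        Dim folder As String = Path.GetDirectoryName(xlsxPath)
        Try
            For Each ws As Worksheet In wb.Worksheets
                ' CSV name = workbook name + "_" + sheet name, per the requirement.
                Dim csvPath As String = Path.Combine(folder, baseName & "_" & ws.Name & ".csv")
                ws.SaveAs(csvPath, XlFileFormat.xlCSV)
            Next
        Finally
            wb.Close(False)   ' don't save changes back to the workbook
            app.Quit()
        End Try
    End Sub
End Class

ExportSheets would be called from the Script Task's Main, with the path coming from a package variable.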
I want to reduce the number of virtual log files (VLFs) I have. In reading through the Books Online documentation, I realized that I forgot to move the transaction log files to a different drive. Now that the server is in production, I wanted to get some input about the best way to make this change.
Can I just change the directory the log files are written to in the database properties without any adverse effects?
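For what it's worth, changing the path in the properties dialog alone does not move the physical file. A minimal sketch of the usual offline move on SQL Server 2005 and later, where MyDB, MyDB_log, and the path are hypothetical:

ALTER DATABASE MyDB MODIFY FILE (NAME = MyDB_log, FILENAME = 'E:\SQLLogs\MyDB_log.ldf');
ALTER DATABASE MyDB SET OFFLINE WITH ROLLBACK IMMEDIATE;
-- copy the .ldf file to E:\SQLLogs at the OS level, then:
ALTER DATABASE MyDB SET ONLINE;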
Hi! I'm using replication with two databases on SQL 2000. When we began, the log file size was 50 MB and the data file size was 150 MB, but now the log files are 2 GB and the data files 4 GB. I would like to shrink the log files and the data files. How do I do this? (I tried truncate and shrink, but nothing changes.) Thanks!
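A hedged sketch of the SQL 2000-era sequence, with hypothetical names. Note that replication holds log records until the log reader has processed them, which is often why a shrink appears to do nothing until the replication agents catch up:

BACKUP LOG MyDB WITH TRUNCATE_ONLY   -- discards inactive log records (SQL 2000/2005 only; breaks the log backup chain)
DBCC SHRINKFILE (MyDB_Log, 100)      -- target size in MB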
I need to write code to access many image files and WAV files and display them on a web page. Both kinds of files are stored on the server (as many experts in this forum advised) and their locations are in the SQL Server database. As for the WAV files, I think I will have to convert them to MP3 first for fast delivery over the internet. Since I am a newbie, any help would be appreciated. Thanks.
I have a problem with bcp and format files. We changed our databases from varchar to nvarchar to support Unicode; no problems so far with that, it is working fine. But now I need a format file for the customer table, and it is not working. It works fine with the old DB with varchar, but with nvarchar I'm not able to copy the data. The biggest problem is that I get no error message: bcp starts copying to the table and finishes without an error message.

This is my table:

CREATE TABLE [dbo].[Customer] (
[ID] [int] NOT NULL,
[CreationTime] [datetime] NULL,
[ModificationTime] [datetime] NULL,
[DiscoveryTime] [datetime] NULL,
[Name_] [nvarchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AI NULL,
[Class] [int] NULL,
[Subclass] [int] NULL,
[Capabilities] [int] NULL,
[SnapshotID] [int] NOT NULL,
[CompanyName] [nvarchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AI NOT NULL,
[TargetRCCountry] [nvarchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AI NOT NULL,
[LocationID] [int] NULL,
[MirrorID] [binary] (16) NULL,
[DeleteFlag] [bit] NULL,
[AdminStatus] [bit] NULL
) ON [PRIMARY]
GO

and this is the format file:

8.0
13
1 SQLINT 1 12 "#~@~#" 1 ID ""
2 SQLDATETIME 1 24 "#~@~#" 2 CreationTime ""
3 SQLDATETIME 1 24 "#~@~#" 3 ModificationTime ""
4 SQLDATETIME 1 24 "#~@~#" 4 DiscoveryTime ""
5 SQLNCHAR 2 510 "#~@~#" 5 Name_ SQL_Latin1_General_CP1_CI_AS
6 SQLINT 1 12 "#~@~#" 6 Class ""
7 SQLINT 1 12 "#~@~#" 7 Subclass ""
8 SQLINT 1 12 "#~@~#" 8 Capabilities ""
9 SQLINT 1 12 "#~@~#" 9 SnapshotID ""
10 SQLNCHAR 2 510 "#~@~#" 10 CompanyName SQL_Latin1_General_CP1_CI_AS
11 SQLNCHAR 2 510 "#~@~#" 11 TargetRCCountry SQL_Latin1_General_CP1_CI_AS
12 SQLINT 1 12 "#~@~#" 12 LocationID ""
13 SQLBINARY 1 33 "#~@~#" 13 MirrorID ""

"#~@~#" is the field terminator. We have a lot of text files with all kinds of characters in them, so we think this is a sequence that will never occur in our files. Thanks for your help!
Are there any MDS descriptor files associated with database files? Are they associated with transaction logs? As far as I know there are .mdf, .ndf, and .ldf files. Could anyone please guide me ASAP, as there is an urgent requirement from one of our clients?
I've created a linked server (and set up the corresponding schema.ini file) in order to perform bulk-inserts from some CSV text files into SQL tables (from my standpoint the text files are just for reading purposes). The linked server works fine (I can select the data in the files without a problem).
Now the question: is it possible to automatically detect when one or more of those files change, in order to start the import process automatically? Something like having a trigger on the CSV files? Or is there no easy way to do that, so I have to, say, create a job that periodically checks whether the files have changed programmatically (for example, recording each file's timestamp every time it is imported and comparing the recorded value with the current one, or whatever)?
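There is no trigger mechanism on plain files from inside SQL Server, but a small watcher process outside it can kick off the import. A minimal VB sketch for a tiny console app (or Windows service), where the drop folder, connection string, and a stored procedure dbo.ImportCsvFiles wrapping the linked-server INSERT...SELECT are all hypothetical:

Imports System
Imports System.IO
Imports System.Data
Imports System.Data.SqlClient

Module CsvWatcher
    Sub Main()
        ' Fire on new or modified CSV files in the drop folder.
        Dim watcher As New FileSystemWatcher("C:\CsvDrop", "*.csv")
        watcher.NotifyFilter = NotifyFilters.LastWrite Or NotifyFilters.FileName
        AddHandler watcher.Changed, AddressOf OnCsvChanged
        AddHandler watcher.Created, AddressOf OnCsvChanged
        watcher.EnableRaisingEvents = True
        Console.ReadLine()   ' keep the process alive
    End Sub

    Private Sub OnCsvChanged(ByVal sender As Object, ByVal e As FileSystemEventArgs)
        ' Run the import procedure whenever a CSV is created or modified.
        Using cn As New SqlConnection("Server=.;Database=MyDB;Integrated Security=SSPI")
            Using cmd As New SqlCommand("dbo.ImportCsvFiles", cn)
                cmd.CommandType = CommandType.StoredProcedure
                cn.Open()
                cmd.ExecuteNonQuery()
            End Using
        End Using
    End Sub
End Module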
Hi, I guess it is a very simple question but I really can't think of an answer: if I'm using CSV files as a database, what disadvantages might I have? I encrypt the data so it is secure, and I found a CSV reader that reads the files very fast. Why would I use SQL instead? Thanks, Alex.
Hello, I am working with a database that has four files: two with weird extensions, one .ndf, and one .ldf. Can I combine the files when I restore, so that I have just two files like the rest of the databases on my system? I am no database expert and need some help. Thanks.
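For what it's worth, a restore itself cannot merge files, but afterwards you can usually drain a secondary data file into the remaining ones and then drop it (the primary data file and the log can never be removed). A minimal sketch with hypothetical names:

DBCC SHRINKFILE (MyDB_Extra, EMPTYFILE);     -- move its data into the other files in the filegroup
ALTER DATABASE MyDB REMOVE FILE MyDB_Extra;  -- drop the now-empty file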
Am I right in saying that log files can only be saved on the server's physical disk? I think I am. In that case, has anyone got a good way of scheduling a job that removes the log files from the local server and puts them onto another server?
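If these are log backup files, one hedged option is a SQL Agent job with a CmdExec step that moves them off after each backup; the paths below are hypothetical:

robocopy "E:\SQLBackups\Logs" "\\OtherServer\SQLBackups\Logs" *.trn /MOV /R:2 /W:5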
Firstly, I should mention that I'm new to the SQL Server arena. I'm using SQL Server 7.0 with SP1. I have a DTS problem: I'm trying to import a DIF file and can't find any way to do it. I've gotten around it for the moment, rather crudely, by importing into Excel and exporting as an Excel file. However, this limits me to 66K rows, which I will need to exceed. I can't find any information anywhere that tells me whether this is possible, and I can find no ODBC drivers to help. I'm sure it's possible, but the file type seems so weird compared to a tab-delimited one! Hope someone can help. Thanks, Dave.
Hi: I am new to this forum. Does anyone know of a quick and easy way to combine MDF and NDF files for the same database? Or does the entire database have to be exported and then imported into a database with only an MDF file? Thanks in advance for the help!
I'm new to the world of SQL. I have a DB that is around 4 GB with a log file of 40 GB, and I've got approximately 5 GB of disk left. How can I reduce the log file size?
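A minimal sketch of the usual sequence, with hypothetical names. If point-in-time recovery is not needed, switching to simple recovery avoids having to find 40 GB of space for a log backup first:

ALTER DATABASE MyDB SET RECOVERY SIMPLE;  -- releases the inactive log (log backups no longer possible)
DBCC SHRINKFILE (MyDB_Log, 1024);         -- shrink to a 1 GB target (size in MB)
-- switch back to FULL recovery and take a full backup if log backups are needed again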
Hi there! I hope somebody out there can lend a helping hand with this challenge I've been struggling with. I am new to this site, so please excuse my ignorance on protocol. Anyway, I'm faced with the challenge of loading a 100-plus-column, comma-delimited file into SQL Server. I had this process working at one point but, for whatever reason, am now receiving "memory allocation" errors related to my BCP process. What I'm doing is trying to load a comma-delimited file into SQL Server. I've "creatively" created a format file that loads the data into the respective columns and "tricks" the comma-quote-comma delimiters into dummy fields (associated with server column 0). So now I've got a format file with a host file count (column count) of about 200. The problem is that when I run my bcp via the command line, Dr. Watson kicks in and I get the standard Windows NT "Application Error" stating that I do not have enough memory resources allocated. What is strange, however, is that if I change my column count to, say, 150, my bcp works (minus the last 25 fields). Not that I'm picky or anything, but my users insist they need the last 25 fields. I've tried everything I could think of, from upping my virtual memory threshold to running on different workstations (including the SQL Server itself). Any advice would be greatly appreciated.
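For readers unfamiliar with the trick being described, here is a fragment of such a format file with hypothetical names: real fields map to server columns 1..n, while the quote-comma-quote separators are absorbed by dummy host fields mapped to server column 0 (meaning "skip this field"):

7.0
4
1 SQLCHAR 0 0  "\""     0 dummy_quote
2 SQLCHAR 0 50 "\",\""  1 FirstName
3 SQLCHAR 0 50 "\",\""  2 LastName
4 SQLCHAR 0 50 "\"\r\n" 3 City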
I have four .dat files that I need to combine into either one table or one file. I can't seem to get this process to work. The only delimiter is a comma between each row.
I have not got a clue how to do this. When I use the import wizard it does not work.
Does anyone know how I can get these files inserted into a SQL table??
Thanks for your help. I seem to always get a solution from you guys.
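For anyone hitting the same problem, a hedged starting point with hypothetical table, column, and path names; one BULK INSERT per .dat file into the same table stacks all four together (adjust the terminators to whatever the files actually use):

CREATE TABLE dbo.CombinedData (col1 VARCHAR(100), col2 VARCHAR(100), col3 VARCHAR(100));

BULK INSERT dbo.CombinedData FROM 'C:\Data\file1.dat'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');
-- repeat for file2.dat through file4.dat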