Insert PowerPoint File Into SQL Server Table
Nov 25, 2007
Can someone tell me how to insert a PowerPoint file into a SQL Server database?
Thanks,
Peter
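A minimal T-SQL sketch of the usual approach, assuming SQL Server 2005 or later: store the file's bytes in a varbinary(max) column and load them server-side with OPENROWSET(BULK ..., SINGLE_BLOB). The table, column, and path names here are assumptions, not from the post:
CREATE TABLE dbo.Presentations (
    PresentationID int IDENTITY(1,1) PRIMARY KEY,
    FileName nvarchar(260) NOT NULL,
    FileData varbinary(max) NOT NULL   -- the raw .ppt bytes
)

INSERT INTO dbo.Presentations (FileName, FileData)
SELECT N'deck.ppt', BulkColumn   -- BulkColumn is the single column OPENROWSET returns
FROM OPENROWSET(BULK N'C:\files\deck.ppt', SINGLE_BLOB) AS f
From application code, the equivalent is a parameterized INSERT with the file bytes passed as a varbinary parameter.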
Dear all,
I am doing a project where I need to extract data from a table in a SQL Server and then automatically create a bar chart with this data, preferably in PowerPoint (or Excel). I was thinking myself that it may be done in VBA/macro added to a button in PowerPoint. If anyone knows how to do this I would be very grateful, or if anyone knows a useful website could they post it up.
Thanks for your help
Laura
I have been tasked to create a data table and stored procedure to extract a specially formatted XML file that is an attachment within a standard XML envelope. The XML file is an attachment in a node within the XML wrapper. There are other MIME files (PDFs) that are handled by a separate procedure, but I need to extract just the attached XML file and put it into the data table with some other PK/FK fields.
Is a blob the best datatype? How do I insert that XML file into it?
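On SQL Server 2005 and later, the xml data type is usually a better fit than an opaque blob, because XQuery can carve the attachment node out of the envelope. A hedged sketch; the table, column, and node names are assumptions, since the actual envelope schema isn't shown:
CREATE TABLE dbo.XmlAttachments (
    AttachmentID int IDENTITY(1,1) PRIMARY KEY,
    EnvelopeID int NOT NULL,      -- FK back to the envelope row (assumption)
    AttachmentXml xml NOT NULL
)

-- Pull just the attached XML node out of the full envelope:
INSERT INTO dbo.XmlAttachments (EnvelopeID, AttachmentXml)
SELECT e.EnvelopeID,
       e.EnvelopeXml.query('/Envelope/Attachments/XmlAttachment')   -- hypothetical path
FROM dbo.Envelopes AS e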
Hi everybody,
For my project I want to read data from a CSV file and insert it into a SQL Server database table. The CSV file may contain any number of columns, but I want only certain columns from it inserted into the database table. How can I achieve this? Please help; it is urgent. Thanks in advance.
Thanks and regards
Biju.S.G
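One hedged approach: BULK INSERT the whole file into a staging table whose columns mirror the CSV, then INSERT...SELECT only the columns you need. The file path, column names, and delimiters below are assumptions:
CREATE TABLE dbo.CsvStaging (
    Col1 varchar(100), Col2 varchar(100), Col3 varchar(100), Col4 varchar(100)
)

BULK INSERT dbo.CsvStaging
FROM 'C:\import\data.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2)

-- Keep only the columns of interest:
INSERT INTO dbo.TargetTable (ColA, ColB)
SELECT Col1, Col3
FROM dbo.CsvStaging
A bcp format file can also skip columns during the load itself, as shown further down the thread.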
Overall goal: Write a BULK INSERT statement using the UNC path of a FileTable directory.
Issue: When using the UNC path of the FileTable directory in a BULK INSERT statement, I receive "Operating system error code 50 (The request is not supported.)" Looking for confirmation as to whether this is truly not supported.
Environment: SQL Server 2012 Standard. Windows Server 2008 R2 Standard
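Reportedly this is indeed not supported: the FileTable share is not an ordinary SMB share, and file operations it does not implement (BULK INSERT's reads among them) fail with error 50. A hedged workaround is to read the bytes straight out of the FileTable instead; the table and file names below are assumptions:
-- Read the file's contents directly from the FileTable's file_stream column:
SELECT CAST(ft.file_stream AS varchar(max)) AS FileContents
FROM dbo.MyFileTable AS ft
WHERE ft.name = 'import.csv'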
Can we bulk insert only the desired columns from a flat file into a table?
I am using SSIS to bulk insert from a file with more than 200 columns, and I am trying to find a way to bulk insert them into multiple tables through SSIS.
The one way I can think of is to pre-map the columns from the file to the destination tables and build numerous Bulk Insert tasks to achieve that, but I am not sure if SSIS will let me do that.
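On the first question: yes, a bcp format file lets BULK INSERT read every field in the file but load only the ones you map; giving a field a server column order of 0 skips it. A hedged sketch for a three-field file whose second field is discarded (the file layout and all names are assumptions):
10.0
3
1   SQLCHAR   0   100   ","      1   Col1   SQL_Latin1_General_CP1_CI_AS
2   SQLCHAR   0   100   ","      0   Skip   ""
3   SQLCHAR   0   100   "\r\n"   2   Col3   SQL_Latin1_General_CP1_CI_AS

BULK INSERT dbo.TargetTable
FROM 'C:\import\data.txt'
WITH (FORMATFILE = 'C:\import\data.fmt')
On the second question: each BULK INSERT targets a single table, so loading multiple tables does mean one task (and one format file) per destination, or a staging-table split afterwards.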
Hi,
I have a file which contains data as below:
3
123||
456||
789||
I am reading the file using BULK INSERT and inserting these phone numbers into a table having one column, as below:
BULK INSERT TABLE_NAME FROM 'FILE_PATH'
WITH (KEEPNULLS,FIRSTROW=2,ROWTERMINATOR = '||')
But I want to insert the data into a table having two columns. When I try to insert the data into a table having two columns, nothing is inserted.
Can anyone tell me how to do this?
Thanks,
-Badri
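Since each data row carries only one value, BULK INSERT has nothing to map to the second column. One hedged way through is a format file that maps the single file field to the first table column and lets KEEPNULLS leave the second column NULL; this assumes Windows line endings after the '||', and the table and column names are assumptions:
10.0
1
1   SQLCHAR   0   20   "||\r\n"   1   PhoneNumber   ""

BULK INSERT dbo.PhoneNumbers
FROM 'C:\import\phones.txt'
WITH (FORMATFILE = 'C:\import\phones.fmt', FIRSTROW = 2, KEEPNULLS)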
I am unable to load data from a flat file into a SQL table using a BULK INSERT SQL statement.
My code:-
DECLARE @filePath VARCHAR(200)
DECLARE @sql VARCHAR(8000)
Declare @filename varchar(100)
set @filename='CCNVZ_150401054418'
SET @filePath = 'I:\IncomingFiles\'+@FileName+'.txt'
[Code] .....
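For reference, a hedged sketch of how the truncated part of such a script typically continues; the target table and file layout are assumptions:
SET @sql = 'BULK INSERT dbo.StagingTable FROM ''' + @filePath + '''
            WITH (FIRSTROW = 2, FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')'
EXEC (@sql)
A common cause of failure here is the path: it must be visible to the SQL Server service account on the server itself, and the backslashes must survive the string concatenation.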
Hello all!
I'm trying to see if there is a way to embed SSRS reports into a PowerPoint presentation (or build a PowerPoint presentation using multiple SSRS reports as the slides). I have a requirement to have a presentation available whose content is dynamic, based upon the customer our marketing folks are going to visit that particular day. The idea is they would access this presentation from IIS or a shared folder.
Still new to SSRS (I'm a convert from Cognos / Hyperion), so if this is a real simple question go easy on me! ;-)
Thanks in advance!
I have the following stored procedure in SQL Server 2005:
CREATE PROCEDURE [dbo].[uspInsert_Blob] (
@fName varchar(60),
@fType char(5),
@fID numeric(18, 0),
@bID char(3),
@fPath nvarchar(60)
)
as
DECLARE @QUERY VARCHAR(2000)
SET @QUERY = "INSERT INTO tblDocTable(FileName, FileType, ImportExportID, BuildingID, Document)
SELECT '"+@fName+"' AS FileName, '"+@fType+"' AS FileType, " + cast(@fID as nvarchar(18)) + " as ImportExportID, '"+@bID+"' AS BuildingID, * FROM OPENROWSET( BULK '" +@fPath+"' ,SINGLE_BLOB)
AS Document"
EXEC (@QUERY)
This puts some values including a pdf or .doc file into a table, tblDocTable.
Is it possible to change this so that I can get the values from a table rather than as parameters? The query would be in the form: insert into tblDocTable (a, b, c, d) select a, b, c, d from tblImportExport.
tblImportExport has the path for the document (DocPath), so I would substitute that field, i.e. DocPath, for the @fPath variable.
Otherwise I can only see doing a FETCH NEXT from tblImportExport, where I would put every field into a variable and then run this EXEC query on those, thus looping through every row in tblImportExport.
Any ideas how to do this?
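Because OPENROWSET(BULK ...) needs its path at the time the dynamic SQL is built, the row-by-row loop the post anticipates is the standard route. A hedged cursor sketch that feeds each tblImportExport row to the existing procedure; DocPath comes from the post, the other column names are assumptions:
DECLARE @fName varchar(60), @fType char(5), @fID numeric(18,0), @bID char(3), @fPath nvarchar(60)

DECLARE doc_cursor CURSOR FOR
    SELECT DocName, DocType, ImportExportID, BuildingID, DocPath   -- DocPath per the post; the rest assumed
    FROM tblImportExport

OPEN doc_cursor
FETCH NEXT FROM doc_cursor INTO @fName, @fType, @fID, @bID, @fPath
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC dbo.uspInsert_Blob @fName, @fType, @fID, @bID, @fPath
    FETCH NEXT FROM doc_cursor INTO @fName, @fType, @fID, @bID, @fPath
END
CLOSE doc_cursor
DEALLOCATE doc_cursor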
I'm trying to insert a blob (*.ICO) into a SQL Server table.
How can I do this?
Is it possible to load files (*.bmp, *.jpg, etc.) into a table (field of type IMAGE) using BULK INSERT?
Or is it better to do it some other way?
Thanks
Hi all,
I am very new to databases, so pardon me for some stupid questions.
I have 3 tables: Table1, Table2, and Table3.
Table1 and Table2 have 5 columns each, and Table3 has 4 columns.
I have 6 or 7 CSV files (the number of files can change) uploaded to an upload directory.
The CSV filenames contain the name of the target table, for example table1_ab.csv, table1_cd.csv, table2_ef.csv, table2_gh.csv, table3_aa.csv, table3_bb.csv.
A person goes to a web page and clicks a button.
This is what the button should do:
1) Check how many files are there in the upload directory.
2) Use BULK INSERT to individually insert each CSV file into its respective table (the name of the table is in the file name).
3) Copy the CSV file into a backup directory and delete the file from the upload directory.
I am using VB.NET. Any help would be much appreciated.
Thanks
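On the SQL side, the per-file step can be a dynamic BULK INSERT that derives the table name from the file name. A hedged sketch, assuming the table name is everything before the first underscore and the files are comma-delimited; the VB.NET code would enumerate the directory, run this per file, then move the file to the backup directory:
DECLARE @fileName sysname, @tableName sysname, @sql nvarchar(max)
SET @fileName = 'table1_ab.csv'   -- supplied per file by the application
SET @tableName = LEFT(@fileName, CHARINDEX('_', @fileName) - 1)

SET @sql = N'BULK INSERT dbo.' + QUOTENAME(@tableName) +
           N' FROM ''C:\upload\' + @fileName + N'''' +
           N' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')'
EXEC sp_executesql @sql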
Hi. My web server's MySQL manager allows inserting data from a text file into a table.
Currently I have a table with the following fields:
email name country
I want to insert data from a text file into this table.
So my question is: how should the text file be formatted?
I.e., the first record would be > xxxx@yahoo.com john us
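A minimal sketch of what such an import file usually looks like, assuming one record per line with a consistent delimiter (comma here) between fields, in the same order as the table's columns; the second row is invented for illustration:
xxxx@yahoo.com,john,us
yyyy@example.com,mary,uk
The delimiter just has to match whatever the import dialog expects; check its field-separator setting.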
Hi Guys,
I hope this is easy stuff for you.
First of all, I searched the forum but didn't find exactly what I am looking for:
I have a file folder which contains 1..n files of the same type.
The files contain a date value at the beginning of each row. I now want to read the first record of each file, extract the date value, and search my import table for any records with that date. If there are already records with this date, I know I already imported that file and can skip it in my For Each File container. If no records are found, I want to copy the rows from the file to the table.
So I have a flat file source, and I thought I would just add an OLE DB Command task afterwards which looks like "SELECT COUNT(*) FROM Import WHERE Processdate = ?", and then a conditional split on whether the count == 0 or not. But I have problems getting the count value out of the OLE DB Command task: every time I try to add an output column I get the message "An output cannot be added to the output collection", and there is no way to map an expression to the result.
I tried to work around the problem using a Lookup task, but that seems to be the wrong way.
thanks for your help
bye
AS
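A hedged alternative to the OLE DB Command (which works row by row inside the data flow and cannot surface a result column this way): run the count in an Execute SQL Task in the control flow before the data flow, with ResultSet set to single row, bind the output to a package variable, and gate the data flow with a precedence constraint on that variable. The query itself is just:
SELECT COUNT(*) AS AlreadyImported
FROM Import
WHERE Processdate = ?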
Hi,
Here are the steps I should take...
1- Check the log table and find the run status (there is a date field which tells the last run day)
2- Let's say the last day was 2008-05-15, so I have to check whether an A1.YYYYMMDD file exists in the folder for each
day, like A1.20080516, A1.20080517 and A1.20080518 (until today)
3- If the A1.20080516 text file exists, then I have to load it into the table, and the same for the other dates:
if A1.20080517 exists, I have to load it into the table, and so on
It looks like a For Each loop: first I have to get the last date, then I have to check whether the file exists for each date, and
if the file for that date exists, I have to load it into the table...
Please tell me how I can do it. It looks like complex looping...
thanks,
J
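A hedged T-SQL sketch of the loop, assuming a log table RunLog holding the last run date, a staging table dbo.A1_Staging, and files under C:\files\; master.dbo.xp_fileexist is undocumented but widely used for the existence check (an SSIS For Each loop with a script task is the supported alternative):
DECLARE @d datetime, @path varchar(260), @sql nvarchar(max)
DECLARE @t TABLE (FileExists int, IsDirectory int, ParentDirExists int)

SELECT @d = MAX(RunDate) FROM RunLog   -- last day already loaded (assumption)

WHILE DATEDIFF(day, @d, GETDATE()) > 0
BEGIN
    SET @d = DATEADD(day, 1, @d)
    SET @path = 'C:\files\A1.' + CONVERT(char(8), @d, 112)   -- A1.YYYYMMDD

    DELETE FROM @t
    INSERT INTO @t EXEC master.dbo.xp_fileexist @path

    IF EXISTS (SELECT 1 FROM @t WHERE FileExists = 1)
    BEGIN
        SET @sql = N'BULK INSERT dbo.A1_Staging FROM ''' + @path +
                   N''' WITH (ROWTERMINATOR = ''\n'')'
        EXEC (@sql)
    END
END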
Hi
I need to insert the records from a .dbf file into a SQL table.
How should I go about it?
regards,
Kiran
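One hedged route is an ad hoc OPENROWSET through the Jet provider's dBASE driver: the folder goes in the connection string and the .dbf file name in the query. Provider availability and the exact dBASE version string vary by machine, and Ad Hoc Distributed Queries must be enabled; all names below are assumptions:
INSERT INTO dbo.TargetTable   -- assumed destination
SELECT *
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'dBase IV;Database=C:\dbf_folder\',
                'SELECT * FROM mydata.dbf')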
Hey all
I have a bulk insert situation that would be nice to be able to pull off. I have a flat file with 46 columns that are to go into a table. I want the table to have a 47th column, to be updated later on by a stored proc saying whether the import into the system was successful or not. I have the row terminator set as '"', thinking that would tell SQL to begin on the next row, leaving the ImportStatus column null, but I still receive an error.
First of all, is this idea possible within this insert statement? Secondly, if so, what would be the syntax to tell the insert statement to skip that particular column? It is the last column listed in the table, so the insert just needs to start on the next row after it inserts the last bit of data from the flat file.
If this is not possible, is it possible to bulk insert into a temp table?
Thanks
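Both are possible. One hedged way to skip the 47th column without a format file is to BULK INSERT through a view that exposes only the 46 file columns, so ImportStatus stays NULL until the stored proc updates it; all names below are assumptions:
CREATE VIEW dbo.vImportTarget
AS
SELECT Col1, Col2, Col3, Col4, Col5, Col6   -- list all 46 file columns, omitting ImportStatus
FROM dbo.ImportTable
GO

BULK INSERT dbo.vImportTarget
FROM 'C:\import\file.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')
Bulk inserting into a temp table works too; the 47th column can then be populated when copying out of it.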
I am running a set of SQL statements on a SQL Server to insert flat file data into a SQL table. The flat file has already been FTP'ed to the SQL Server. I seem to be getting an error which possibly points to a permissions issue.
The statements:
BULK INSERT [Jedox_prod].[dbo].[B_BP_Customer]
FROM 'c:\jedox_daily\jdcom4401.txt'
WITH
(
FIRSTROW = 2,
MAXERRORS = 0,
FIELDTERMINATOR = '|',
ROWTERMINATOR = '\n'
)
GO
The error is :
Msg 4861, Level 16, State 1, Line 1
Cannot bulk load because the file "c:\jedox_daily\jdcom4401.txt" could not be opened. Operating system error code 3(failed to retrieve text for this error. Reason: 1815)
If it is permissions issue, how do I overcome this?
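For what it's worth, OS error 3 is "the system cannot find the path specified" rather than a permissions failure (access denied would be error 5). BULK INSERT resolves the path on the machine where SQL Server runs, so 'c:\...' must exist on the server itself; when the file lives elsewhere, the usual fix is a UNC path readable by the SQL Server service account, along the lines of this hedged sketch (server and share names are assumptions):
BULK INSERT [Jedox_prod].[dbo].[B_BP_Customer]
FROM '\\fileserver\imports\jdcom4401.txt'
WITH (FIRSTROW = 2, MAXERRORS = 0, FIELDTERMINATOR = '|', ROWTERMINATOR = '\n')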
hi my friends
I want to read data from a text file and write it to my table in SQL.
Please help me.
M.O.H.I.N
Ok, I'm not quite sure how to approach this one. This is a VB.NET console app in which I want to capture each row and throw it into a table. The reason being, they want a report on what was processed, which I'll be able to do easily in Reporting Services 2005 once this data is in a table where it should be.
1) What should I use to do this, a dataset? I want to use stored procedures also, not inline SQL.
The function here takes an incoming file and splits it up into separate files. I want to insert each row that is successfully split.
Public Sub ProcessFiles(ByVal sIncomingfile As String, ByVal sOutputDirectory As String)
If sIncomingfile <> "" And sOutputDirectory <> "" Then
    Dim f As New Security.Permissions.FileIOPermission(Security.Permissions.PermissionState.None)
    f.AllLocalFiles = Security.Permissions.FileIOPermissionAccess.Read

    Dim file As New IO.FileInfo(sIncomingfile)
    Dim filefs As IO.FileStream = Nothing
    If file.Exists Then
        Try
            filefs = New IO.FileStream(file.FullName, IO.FileMode.Open) 'Place: 1
        Catch ex As Exception
            SendEmail("Incoming .mnt or .naf Filename Invalid or not found", "Place: 1")
            Application.Exit()
        End Try
    End If

    Dim reader As New IO.StreamReader(filefs)
    Dim counter As Integer = 0
    Dim CurrentFS As IO.FileStream
    Dim CurrentWriter As IO.StreamWriter
    Dim extension As String = IO.Path.GetExtension(file.FullName)

    If extension = ".mnt" Then
        While Not reader.Peek < 0
            Dim Line As String = reader.ReadLine
            If IsNumeric(Line.Substring(0, 1)) Then
                Dim Parts() As String = Line.Split(" "c) ' split row into parts
                If Parts(0).Length = 8 Then ' a first part of length 8 marks a new header, so start a new output file
                    counter += 1
                    If Not CurrentWriter Is Nothing Then CurrentWriter.Flush() : CurrentWriter.Close()
                    CurrentFS = New IO.FileStream(IO.Path.Combine(IO.Path.GetDirectoryName(sOutputDirectory), Line.Substring(59, 4) & "[" & counter.ToString & "]" & Now.ToString("MM-dd-yyyy") & IO.Path.GetExtension(file.FullName)), IO.FileMode.Create)
                    CurrentWriter = New IO.StreamWriter(CurrentFS)
                End If
                If Not CurrentWriter Is Nothing Then
                    CurrentWriter.WriteLine(Line)
                End If
            End If
        End While

        If Not CurrentWriter Is Nothing Then CurrentWriter.Flush() : CurrentWriter.Close()
        MoveFilesFTP(sOutputDirectory, "mnt")

    ElseIf extension = ".naf" Then
        While Not reader.Peek < 0
            Dim Line As String = reader.ReadLine
            If Not IsNumeric(Line.Substring(0, 1)) Then ' a non-numeric first character marks a header, so split the file
                counter += 1
                If Not CurrentWriter Is Nothing Then CurrentWriter.Flush() : CurrentWriter.Close()
                CurrentFS = New IO.FileStream(IO.Path.Combine(IO.Path.GetDirectoryName(sOutputDirectory), Line.Substring(6, 4) & "[" & counter.ToString & "]" & Now.ToString("MM-dd-yyyy") & IO.Path.GetExtension(file.FullName)), IO.FileMode.Create)
                CurrentWriter = New IO.StreamWriter(CurrentFS)
            End If
            If Not CurrentWriter Is Nothing Then
                CurrentWriter.WriteLine(Line)
            End If
        End While

        If Not CurrentWriter Is Nothing Then CurrentWriter.Flush() : CurrentWriter.Close()
        MoveFilesFTP(sOutputDirectory, "naf")
    End If
Else 'input file not valid
    SendEmail("Incoming .mnt or .naf Filename Invalid", "Place: 1")
End If
End Sub
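On the database side, a hedged sketch of a stored procedure this could call once per successfully split row (the table and column names are assumptions); from VB.NET it would be invoked with a SqlCommand whose CommandType is StoredProcedure wherever CurrentWriter.WriteLine succeeds:
CREATE PROCEDURE dbo.uspInsertProcessedRow
    @FileName nvarchar(260),
    @RowText  nvarchar(max)
AS
BEGIN
    SET NOCOUNT ON
    INSERT INTO dbo.ProcessedRows (FileName, RowText, ProcessedDate)
    VALUES (@FileName, @RowText, GETDATE())
END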
How do I do a bulk insert into a temp table from a text file? The text file looks like this:
ver_id TYPE
E57AB326-803C-436E-B491-398A255C919A 58
34D2A601-6BBA-46B1-84E1-3A805FDA3812 58
986E140C-62F1-48F1-B428-3571EBF00DA0 58
My statement looks like this:
CREATE TABLE [dbo].[tblTemp]([ver_id] [nvarchar](255), [TYPE] [smallint])
GO
BULK INSERT [dbo].[tblTemp]
FROM 'C:\v2.txt'
I keep receiving errors.
The error I receive is: Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 2 (TYPE).
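A hedged reading of that error: with no FIELDTERMINATOR the whole line is treated as one field, and the header row ("ver_id TYPE") cannot convert to smallint. Assuming the file is tab-delimited, skipping the header and naming the terminators usually clears it:
BULK INSERT [dbo].[tblTemp]
FROM 'C:\v2.txt'
WITH (FIRSTROW = 2, FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')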
I have an Excel file with a .csv extension. It has one sheet named Sheet1.
Now, I'm trying to insert this Excel data into a #temp table. I tried this syntax:
----------------
Exec sp_configure 'show advanced options', 1;
RECONFIGURE;
GO
Exec sp_configure 'Ad Hoc Distributed Queries', 1;
RECONFIGURE;
GO
EXEC master.dbo.sp_MSset_oledb_prop N'Microsoft.ACE.OLEDB.12.0' , N'AllowInProcess' , 1;
[Code] ...
But, I'm getting error:
The OLE DB provider "Microsoft.ACE.OLEDB.12.0" for linked server "(null)" reported an error. The provider did not give any information about the error.
Cannot initialize the data source object of OLE DB provider "Microsoft.ACE.OLEDB.12.0" for linked server "(null)".
If I execute this statement for an .xls file, it works fine and rows are inserted into the #temp table. How do I handle an Excel file with a .csv extension?
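A .csv file is not a workbook, so the ACE provider has to be pointed at it through its text driver: the folder goes in the connection string and the file name in the query. A hedged sketch; the path, file name, and HDR setting are assumptions:
SELECT *
INTO #temp
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Text;Database=C:\import\;HDR=YES;FMT=Delimited',
                'SELECT * FROM [data.csv]')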
I'm inserting from TempAccrual into VacationAccrual. It works nicely; however, if I run this script again it will insert the same values again into VacationAccrual. How do I block that, while still allowing the insert if there is a small change in one of the columns in TempAccrual? Here is my query:
INSERT INTO vacationaccrual
(empno,
accrued_vacation,
accrued_sick_effective_date,
accrued_sick,
import_date)
[Code] ....
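A hedged way to make the script rerunnable: filter the insert with NOT EXISTS, so a row is inserted only when no identical row (across the compared columns) is already present; a change in any compared column lets the row through. The column list follows the INSERT above, but the exact comparison set is an assumption:
INSERT INTO vacationaccrual
    (empno, accrued_vacation, accrued_sick_effective_date, accrued_sick, import_date)
SELECT t.empno, t.accrued_vacation, t.accrued_sick_effective_date, t.accrued_sick, GETDATE()
FROM tempaccrual AS t
WHERE NOT EXISTS (SELECT 1
                  FROM vacationaccrual AS v
                  WHERE v.empno = t.empno
                    AND v.accrued_vacation = t.accrued_vacation
                    AND v.accrued_sick_effective_date = t.accrued_sick_effective_date
                    AND v.accrued_sick = t.accrued_sick)
On SQL Server 2008 and later, a MERGE statement is the other common route.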
Hello Guys,
I am working at a company and am currently assigned to a new project for data migration from company X to our company Y using SSIS. I am totally new to it; I just completed the 5 tutorials given on the MSDN website.
Basically, the client is going to send us a first flat file with 1 million records, with Header, Detail and Trailer records.
I want to create a package in such a way that it dumps all of this first load into 7 to 8 different tables at a time.
We also have to include functionality for validation and error checking.
On a successful load, the error file should contain only the Header and Trailer, but no Detail records.
If there are any errors, then the error file should contain the Header, the Detail records which failed to load, plus the Trailer, which we have to send back to the client.
When the 2nd file comes, we have to check whether each record is new or changed (an update), depending on a flag which tells us.
This is basically the high-level idea of the package I need to create. If you guys have any questions, let me know.
I know you guys are very experienced. If any of you could give me some detailed ideas on it, I would really appreciate it.
I have a very limited timeline for it.
Thanks
Shah
Hi,
I'm new to SQL Server 2005 SSIS. I'm trying to do something very simple, but I cannot figure it out, PLEASE HELP!
I have a flat file which I read and then insert the data into a database table; that works fine. The problem is that I don't want to insert duplicate records. For example, if I run the package again, it will append to the table. What I need is that, if the package runs again, it checks whether the record already exists, based on two columns (date and hour), and does not insert the record.
Thank you,
Aldo
Hi,
I used the following data flow in my package, using Oracle 8i:
Flat File Source -------------> Data Conversion ---------------> OLE DB Destination (Oracle data table)
I am using the above flow to map the source file to the destination. When I run the SSIS package, I get the following error:
"Truncation may occur due to inserting data from data flow column "ColumnName" with a length of 50"
Regarding this error, I understood it happens because of the data length, so I changed the source column length to exactly match the destination table.
I am still getting this error. Please can anyone give me a solution? Should I change the data type also?
Please give your suggestions.
Thanks & Regards
Jeyakumar.M
chennai
I have a data flow task in a For Each Loop container in the control flow of an SSIS package. This For Each Loop container reads the CSV files from the specified location one by one and populates a variable with the current file name. Note that the tables to which I would like to push the data from each CSV file have the same names as the CSV files. In the data flow task, I have a Flat File component as a source; this component uses the above variable to read the data of a particular file.
Now here my question comes: how can I move the data to the destination SQL table using the same variable name? I've tried to set up the OLE DB destination component dynamically, but it executes well only the first time; it does not change the mappings as per the columns of the second CSV file. There are around 50 CSV files, each with a different set of columns. These files need to be migrated to SQL tables in the optimum way.
What is the best way to set up the data flow task for this requirement? Also, I cannot use the Bulk Insert task here, as we would like to keep a log of corrupted rows.
We can easily load a file into db tables; however, my main concern here is the number of columns in the file. A text file TEXT_1400.txt has 1400 columns. I am unable to load data into my db table using BCP or BULK INSERT commands, as a maximum of 1024 columns are allowed per table in SQL Server 2008.
We can still go ahead and create a 'wide table' (a special table that holds up to 30,000 columns; the maximum size of a wide table row is 8,019 bytes). But when operating on the wide table, BCP/BULK INSERT commands still fail. After a few hours of scratching my head over BCP and BULK INSERT, I observed that while inserting, the BCP/BULK INSERT commands are unable to identify SPARSE columns and skip them, which disturbs the column mapping and results in data conversion and truncation errors.
Is there any proper way to load this kind of file into the db table?
Hi. I have a task that I have to get done. I have to create a web page that enables a user to upload a file (pdf, doc, txt), and I want the file to be inserted into a database. So, I would like to know how to get this part working: uploading the file and inserting it into a database. So far, I have a FileUpload control in the form; how do I go from there? Thanks.
I have a lot of data in XML files that I want to put into SQL Server. Can anybody give suggestions on how to insert those XML files into SQL Server?
I want to compare two tables' data and insert the changed fields into a third table...
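A hedged sketch of the usual pattern: EXCEPT finds the rows of the first table that are new or changed relative to the second, and the result goes into the third. Table and column names are assumptions:
INSERT INTO dbo.TableC (id, col1, col2)
SELECT id, col1, col2 FROM dbo.TableA
EXCEPT
SELECT id, col1, col2 FROM dbo.TableB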
Hi... I was hoping someone could share some thoughts on the issue I am having at the moment.
Problem: When I run the package on my local machine and update a local SQL Server DB/table, new records are written to the table OK. BUT when I change my destination, meaning I write records to another physical SQL Server DB/table, no INSERT occurs. AND when I move/copy that same package onto another server (e.g. the server that did not get records written earlier) and run it locally, IT WORKS fine too.
What I am trying to do is very simple: add new records to a SQL Server table using SSIS. I only care about new rows, not even changed rows.
Here is my logic -
1. Create an OLE DB source to RemoteSERVER, using a SELECT statement
2. I have a Lookup component that will look for NEW records; it directs all rows that don't find a match to the redirected (error output) rows
3. Since I don't care about any rows that are matched in my lookup, I do nothing with them, or I trash the rows
4. I send the error rows (the NEW rows) to an OLE DB destination
RESULTS: when I run the package locally and the destination table is also local, it WORKS FINE;
but when I run the package locally and the destination table is on another server (remote), no rows are written.
The package is run through BIDS manually, so there are no security restrictions attached to it.
I am not sure what I am missing, and I do not see an error in my package either. It is not failing.
Thanks in advance!