What Table Contains The Name Of The Log Files?
Apr 5, 2000: I know sysdatabases has the databases, but where are the names of the log files contained?
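A minimal sketch of an answer, assuming a SQL Server 2000-era instance: the physical data and log file names are listed per database in master.dbo.sysaltfiles (or in each database's own sysfiles); log files are the rows with groupid = 0.
-- list every file of every database; log files have groupid = 0
SELECT dbid, groupid, name, filename
FROM master.dbo.sysaltfiles
ORDER BY dbid, groupid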
Hello! I have two applications that import data from text files to Access 97 tables. One of these applications is in Access 97 and the other in VB6; now they are asking me to import this text information into a SQL table. I do not have any experience with SQL Server. Can someone give me an idea of what I could do to accomplish this? I would appreciate anyone's help, thanks! ...
Hello everyone, I need your help: how can I back up only a certain table using SQL Server Management Studio, or maybe some commands?
Hi:
I am trying to set up a TEST ENVIRONMENT for a reservation software package. I need this setup so that I can run various scheduling scenarios in order to optimize operations. Below are the instructions I was given by the software vendor on how to set up my SQL database. However, I am a little confused about a couple of the steps. They are as follows:
***In SQL Server 2000 Enterprise Manager on the test laptop, change the 2 path settings in the table SGCONFIG. They should be changed to C:\Stratagen\Adept5\Server. ** YOU WILL NEED TO DO THIS STEP EVERY TIME YOU RESTORE A BACKUP ADEPT5_CLASTRAN.BAK FROM THE PRODUCTION TO THE TEST ENVIRONMENT
QUESTION: How do you change the path settings on a table?
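For what it's worth: in Enterprise Manager you can right-click the table, choose Open Table > Return All Rows, and edit the two path values directly in the grid. The T-SQL equivalent is a plain UPDATE; the column names below are hypothetical, since the vendor did not give the SGCONFIG layout:
-- hypothetical column names; check SGCONFIG's actual schema first
UPDATE SGCONFIG
SET AppPath    = 'C:\Stratagen\Adept5\Server',
    ServerPath = 'C:\Stratagen\Adept5\Server'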
***Open the .udl files in the apps folder and the server apps folder and check to see that they are pointing to the test laptop server, not the production server.
QUESTION: What are the .udl files and how do you check to see if they are pointing at the test laptop server only?
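A .udl file is a Microsoft Data Link: a small plain-text file holding an OLE DB connection string. Double-clicking it opens the Data Link Properties dialog; opening it in Notepad shows the string, and the Data Source value names the server it points at. A typical content looks like this (the server and catalog names here are illustrative):
[oledb]
; Everything after this line is an OLE DB initstring
Provider=SQLOLEDB.1;Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=Adept5;Data Source=TESTLAPTOP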
Thanks sincerely for your help. I am trying to meet a deadline for a meeting tomorrow. Therefore, I am desperate. Please send me email. rtanner@clastran.com.
Ron
Hi all, I have the following application to build: I receive several .csv files from another application in a given folder on my PC. Those files are named with the format log1.csv, logs2.csv, logs... The number of files is variable, but the internal format is always: time_sec;level. So the files contain a field that may be used as a unique key in the target database. I'm trying to build a DTS package that should periodically import all the CSVs present in the folder and then delete them if the import succeeds. Apparently it's not as simple as I supposed: I always have to give the name of the table I want to import into. Any idea?
Hi. I am using DBF files as sources for some tables in SQL Server. The problem is that the table names in the DBF files are not all the same (e.g. frame061, frame949). I want to know the names of the tables inside the DBF files. Is there a way to query the table names, something like "select table_name from information_schema.tables" in SQL Server? By the way, those DBF files came from FoxPro. Thanks!
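With the file-based dBASE/FoxPro drivers, each table is simply a .dbf file named after itself, so the table list is the file list in the folder. If a linked server is set up over that folder, the tables can also be enumerated server-side; a sketch, assuming a linked server named FOXPRO_LINK (hypothetical name):
-- lists the tables the linked server's provider exposes
EXEC sp_tables_ex @table_server = 'FOXPRO_LINK', @table_type = 'TABLE'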
Dear All,
I am importing all the files from a particular folder into a table on my database KB. It works perfectly if I run it on the same system where the DB exists, but it does not work from the network.
USE TESTDB
--Table Creation Starts here
Create table Account([ID] int IDENTITY PRIMARY KEY, Name Varchar(100),
AccountNo varchar(100), Balance money)
Create table logtable (id int identity(1,1),
Query varchar(1000),
Importeddate datetime default getdate())
--Table Creation ends here
---Stored Procedure Starts here
Create procedure usp_ImportMultipleFiles @filepath varchar(500),
@pattern varchar(100), @TableName varchar(128)
as
set quoted_identifier off
declare @query varchar(1000)
declare @max1 int
declare @count1 int
Declare @filename varchar(100)
set @count1 = 0
-- collect the matching file names via xp_cmdshell's dir /b
create table #x (name varchar(200))
set @query = 'master.dbo.xp_cmdshell "dir '+@filepath+@pattern+' /b"'
insert #x exec (@query)
delete from #x where name is NULL
select identity(int,1,1) as ID, name into #y from #x
drop table #x
set @max1 = (select max(ID) from #y)
--print @max1
--print @count1
While @count1 <= @max1
begin
set @count1 = @count1 + 1
set @filename = (select name from #y where [id] = @count1)
-- BULK INSERT each file; @filepath must end with a backslash
set @query = 'BULK INSERT '+ @TableName +' FROM "'+ @filepath+@filename+'"
WITH ( FIELDTERMINATOR = ",", ROWTERMINATOR = "\n")'
--print @query
exec (@query)
insert into logtable (query) select @query
end
drop table #y
--sp ends here
Exec usp_ImportMultipleFiles 'C:\MyImport\', '*.csv', 'Account'
If I use the above Exec like
Exec usp_ImportMultipleFiles '\\kb-02\C$\MyImport\', '*.csv', 'Account'
I am getting the following error:
Could not bulk insert because file '\\kb-02\C$\MyImport\...' could not be opened.
Operating system error code 5(Access is denied.).
C Drive and MyImport folder is shared on system kb-02
Would appreciate your valuable help. Thanking you in advance.
K006B
Hi,
I am trying to use linked tables to connect SQL Server 2000 to a legacy system using dbase III files. (I need real time read only access to these files)
I have created a linked table from SQL server to a folder on the C drive which contains the dbase III files, using an ODBC DSN which uses "Microsoft dbase driver (*.dbf)". DSN tested successfully using Excel. Linked server connection is then created using "Microsoft OLE DB Provider for ODBC drivers".
The dbase tables appear OK in Enterprise Manager, but I cannot get a query to work in Query Analyzer using the four-part name syntax.
My query is just :
SELECT * FROM LinkedTable...customers
Error message is "Invalid schema or catalog specified for provider 'MSDASQL'". Now I am pretty sure dbase files do not support any sort of schema/catalog setup, so I suspect SQL Server is looking for something it is not going to get.
One clue might be that in Enterprise Manager, under the catalog column, I get the path name to the dbase file, i.e. c:\customers.dbf, which I cannot enter in the four-part syntax.
Any suggestion welcome !!!
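One workaround worth trying (a sketch, assuming the linked server really is named LinkedTable as above): skip the four-part name entirely and let OPENQUERY hand the statement straight to the ODBC driver, which sidesteps the schema/catalog lookup:
SELECT * FROM OPENQUERY(LinkedTable, 'SELECT * FROM customers')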
We have a project to parse an XML file out into relational SQL tables. The XML file is a complex type with multiple levels of nesting. We are trying to use XQuery to parse it out to SQL tables because, for one reason or another, the other options on the table were not viable. I know that we can use C# to do the same thing, but we are sticking to T-SQL with XQuery. Has anybody used the same route for processing large complex XML files?
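For reference, a minimal T-SQL shredding sketch; the element names here are hypothetical, but the nodes()/value() pattern is the standard way to turn nested XML into rows:
DECLARE @doc xml;
SET @doc = N'<orders>
  <order id="1"><item sku="A12" qty="2"/></order>
  <order id="2"><item sku="B34" qty="1"/></order>
</orders>';
-- nodes() turns each repeating element into a row; value() pulls out scalars
SELECT o.n.value('@id',  'int')          AS OrderId,
       i.n.value('@sku', 'varchar(20)')  AS Sku,
       i.n.value('@qty', 'int')          AS Qty
FROM @doc.nodes('/orders/order') AS o(n)
CROSS APPLY o.n.nodes('item')    AS i(n);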
How do I automate importing "all-text" flat files into a SQL 7 table? The key is that there is no validation necessary for the data, and I do not want to import the data manually. I just want to delimit the data and import it using either a script or a scheduler of some type that can do it for me. Someone please help.
Hi all,
I got a situation here.....
From a source table (in SERVER1) I get ids of candidates and from another source (in SERVER2) I get their CVs (text files stored in various Folders). My destination table (in SERVER3) has two fields, CandidateId & CandidateCV.
I have to transfer the data in above fashion for nearly 1 million records.
How can I write a DTS package which picks up the text file from SERVER2 based on the CandidateId which comes from SERVER1? Probably I need some kind of looping mechanism which changes the candidate id & his CV file.
Can anyone help???
Thanks...
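DTS itself has no clean file-per-row loop (this usually ended up as an ActiveX script); if SQL 2005 or later is available, a T-SQL sketch of the same idea follows. The staging table, linked server name, and one-file-per-id folder layout are all assumptions:
DECLARE @id int, @path varchar(300), @sql nvarchar(max);
DECLARE c CURSOR FOR SELECT CandidateId FROM Server1Ids;  -- ids staged from SERVER1
OPEN c;
FETCH NEXT FROM c INTO @id;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- hypothetical folder layout: one .txt per candidate id on SERVER2
    SET @path = '\\SERVER2\CVs\' + CAST(@id AS varchar(10)) + '.txt';
    SET @sql = N'INSERT INTO SERVER3.DestDB.dbo.CandidateCVs (CandidateId, CandidateCV)
                 SELECT ' + CAST(@id AS nvarchar(10)) + N', BulkColumn
                 FROM OPENROWSET(BULK ''' + @path + N''', SINGLE_CLOB) AS cv;';
    EXEC (@sql);
    FETCH NEXT FROM c INTO @id;
END
CLOSE c; DEALLOCATE c;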
I want to query a column of XML documents in a table:
use mysql1
declare @bp xml
select @bp=xml
;WITH XMLNAMESPACES('http://schemas.openehr.org/v1' as bp,'http://www.w3.org/2001/XMLSchema-instance' as xsi,'OBSERVATION' as type)
select * from (
select
m.c.value('(./bp:data/bp:items[1]/bp:value[1]/bp:magnitude)[1]','int') as systolisch
from
BloodpressureMitSchema cross apply
@bp.nodes('/bp:content/bp:data/bp:events') as m(c))m
But with this cross apply I can only query all the values in one XML document, repeated for every row. Is there something wrong at the "declare"?
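Nothing is wrong with the declare itself; the repetition comes from calling nodes() on the single @bp variable instead of on the table's own XML column, so every row re-reads the same document. A hedged correction, assuming the XML column of BloodpressureMitSchema is named [xml] (as the "select @bp=xml" line suggests):
;WITH XMLNAMESPACES('http://schemas.openehr.org/v1' AS bp)
SELECT m.c.value('(./bp:data/bp:items[1]/bp:value[1]/bp:magnitude)[1]', 'int') AS systolisch
FROM BloodpressureMitSchema AS t
CROSS APPLY t.[xml].nodes('/bp:content/bp:data/bp:events') AS m(c);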
I have 600+ Excel .xlsx files that I have been trying to import into a SQL database table. I've been trying to complete this task with SSIS but no luck yet. I have watched several videos and read articles, but when I run the package the source is validated yet I always get an error in the destination. I am using Excel 2010 and SQL Server 2012.
Hi,
I am facing an issue with auto-fit width. I am creating a report which includes a table. The table column width should adjust to the size of the text displayed in it, instead of displaying the text on 2 lines, but I don't find any way to set that option. Could anyone let me know how to set the column width to match the text displayed in the column?
Thanks
I am learning SQL Server Integration Services.
I created a file People.txt containing firstName, LastName separated by a pipe.
------------------content-----------
John | Doe
Mike | James
Adam | Smith
-----------------------------------------
and another one called gender.txt
------------------content-----------
M
---------------------------------------
I would like to create an Integration Services package that combines each record of the first file with the record of the second file and inserts the result into a table.
--------------Result table content------------------
John | Doe | M
Mike | James | M
Adam | Smith | M
-----------------------------------------------------------
Thanks
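In SSIS this is typically done with a Merge Join, or a Derived Column fed by a one-row lookup. As a plain T-SQL sketch of the same idea, with hypothetical staging table names, a CROSS JOIN against the single-row gender table produces exactly the combination shown above:
-- staging tables assumed already loaded from People.txt and gender.txt
SELECT p.FirstName, p.LastName, g.Gender
FROM StagingPeople AS p
CROSS JOIN StagingGender AS g;  -- one row in StagingGender yields one copy per person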
I have more than 500 CSV files with a similar structure [same column names and same data format]. I would like to load these files into a database table on the SQL Server 2014 database.
Hi,
I have a scenario where I have to read 2 files that update the same table (a temp staging table). This comes from the source system's limitation on the number of columns that it can export. What we have done as a workaround is split the data into 2 files, where the 2nd file contains the first file's primary key so we know which record to update...
Here is my problem...
The table that needs to be updated contains 9 columns. File one contains 5 of them and file2 contains 4 of them.
File 1 inserts 100 rows and leaves the other 4 columns as nulls and ready for file 2 to do an update into them.
File 2 inserts 10 rows but fails on 90 rows due to incorrect data.
Thus only 10 rows are successfully updated and ready to be processed, but 90 are incorrect. I want to still do processing on the existing 10 but can't afford to try to do processing on the broken ones...
The easy solution would be to remove the incorrect rows from the temp table whenever an error occurs on the 2nd file's package, by running a SQL query on the table using the primary keys that exist in both files; but when the error occurs on the Flat File source, I can't get the primary key.
What would be the best suggestion? Should I rather fail the whole package if 1 row bombs out? I can't put any logic in the following package that does the master file update/insert from the temp table because of the nature of the data.
Regards
Mike
How do I import multiple text files (residing in a single folder) into a SQL Server table? I know how to import a single file, but I am not sure how multiple files could be loaded. Please guide.
Thanks,
HShah
Hi,
I have a package that reads fixed-width flat file data into a single CHAR column in a SQL Server 2005 table.
In the "Columns" tab of the Flat File Connection Manager, I set the RowDelimiter value to {LF} and nothing for the ColumnDelimiter (since I read the entire row from the flat file into a single column in my sql table).
However, in the "Advanced" tab, the ColumnDelimiter Misc property shows {LF}. This was working fine for me.
The problem I was facing was with some files which were recently identified to have rows containing a special character (probably ASCII zero) in the middle of the row. So now, if the record had 400 characters and the 200th character was this special character, the package was writing the first 200 characters into the SQL table and ignoring the rest.
I am sure that the special character was ASCII ZERO - I wrote a script to read each character in the line and find the ascii code for it.
Has anyone faced this problem ever. If so, pls let me know your solution or any ideas that can help sort this problem. Your help would be much appreciated.
Thanks!!
Hi,
I have to export the table data from my database into text files, as I need to load it into an Informix database using a shell script. Is there a way by which I can do this?
Is there any other way by which I can get the data from SQL Server into Informix?
Any takers,
Thanking you in advance.
Bye for now,
Himauhu
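For reference, a minimal sketch using the bcp utility (here driven through xp_cmdshell; database, table, path, and server names are placeholders) to dump a table as pipe-delimited text that an Informix LOAD or dbload script can then pick up:
-- -c character mode, -t field terminator, -T trusted connection
EXEC master.dbo.xp_cmdshell 'bcp MyDb.dbo.MyTable out C:\export\mytable.txt -c -t"|" -S MYSERVER -T';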
I wanted to know: when I create a table using SQL Server, is the table metadata, like schema name and catalog name, stored in memory or on disk? I am trying to extract the schema name and catalog name when the database is down, or more precisely when I have no connection string. I know about sysindexes, systables, etc., but I am not able to work out whether they store the data that is useful to me, like the catalog name.
What is a SQL query to extract the column names from sysindexes?
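For what it's worth, catalog metadata is persisted in the system tables inside each database's data files on disk (memory only caches it), so with the server down and no connection there is nothing to query. Once connected, column names come from syscolumns rather than sysindexes; a SQL 2000-era sketch:
-- sysindexes holds index names, not column names; columns live in syscolumns
SELECT o.name AS table_name, c.name AS column_name
FROM sysobjects o
JOIN syscolumns c ON c.id = o.id
WHERE o.xtype = 'U'
ORDER BY o.name, c.colid;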
I need to import data to a MS SQL table from massive logs (read: a million and a half rows, every single day) that come in .txt format, delimited with a ";" symbol, and then have some stored procedures analyze that data to generate some reports in an Excel file. The text files include the column headers in the first row, and the data starts on the second one.
The challenge is that the text files differ in column order and count every single day.
The analysis that I need to do only requires about 15 columns from the nearly 90-120 that those files include, and those columns sadly happen to be in a different order across the files.
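Since the column order shifts daily, one hedged approach is to stage each line whole, read the header row to locate where the needed columns sit that day, and build the real INSERT dynamically. A sketch under stated assumptions: the path is hypothetical, 'duration' stands in for one of the 15 wanted headers, the name is not a substring of another header, and the data contains no tab characters (the files are ';'-delimited):
CREATE TABLE #raw (line varchar(max));
BULK INSERT #raw FROM 'C:\logs\today.txt' WITH (ROWTERMINATOR = '\n');

DECLARE @header varchar(max), @before varchar(max), @ordinal int;
SELECT TOP 1 @header = line FROM #raw;  -- not guaranteed first; a real version would tag line numbers

IF CHARINDEX('duration', @header) > 0
BEGIN
    -- counting ';' before the wanted name gives its 1-based position
    SET @before  = LEFT(@header, CHARINDEX('duration', @header) - 1);
    SET @ordinal = LEN(@before) - LEN(REPLACE(@before, ';', '')) + 1;
END

-- with the ordinals known, dynamic SQL can split each data line on ';' and
-- insert only the 15 wanted positions, whatever order that day's file uses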
I am having trouble trying to import XLSX files into SQL 2012 64-bit.
I have installed the Access driver (AccessDatabaseEngine_x64.exe)
I have configured the script to run the following SP
sp_configure 'show advanced options', 1
GO
RECONFIGURE WITH OverRide
GO
sp_configure 'Ad Hoc Distributed Queries', 1
[Code] ....
So I first create my temp table.
Then I run the SP above, and then I run the insert into the temp table defined:
INSERT INTO tempdb.dbo.TempTRBZ (IsNew,CoID, Zip, City, County,StateCode,Rate,Taxable,TaxShip,TaxLab,CountryID,StateID)
SELECT * FROM OPENROWSET( 'Microsoft.ACE.OLEDB.12.0','EXCEL 12.0;Database=C:\Temp\NotInTrbzJan.xlsx;HDR=YES','SELECT * FROM [Data$]')
[Code] ....
The error message I get back is
Msg 7303, Level 16, State 1, Line 4
Cannot initialize the data source object of OLE DB provider "Microsoft.ACE.OLEDB.12.0" for linked server "(null)".
What do I have set wrong on the import? Using SSIS at this point is not a real option.
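One frequent cause of Msg 7303 with ACE is that the provider is not yet allowed to run in-process for ad hoc queries. A hedged sketch of the usual fix (the same switches live under Server Objects > Linked Servers > Providers in SSMS); the file must also be reachable by the SQL Server service account and not open in Excel:
-- run as sysadmin; these switch on in-process use of the ACE provider
EXEC master.dbo.sp_MSset_oledb_prop N'Microsoft.ACE.OLEDB.12.0', N'AllowInProcess', 1;
EXEC master.dbo.sp_MSset_oledb_prop N'Microsoft.ACE.OLEDB.12.0', N'DynamicParameters', 1;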
I have around 100 Excel files in a folder. I want to import all the files dynamically and load all the data into a single table in SQL Server 2008. Without using SSIS, I want to query them using OPENROWSET.
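A hedged sketch of the loop (folder, target table, and sheet name are placeholders; requires xp_cmdshell, 'Ad Hoc Distributed Queries', and the ACE provider to be enabled):
DECLARE @files TABLE (id int IDENTITY(1,1), fname varchar(260));
INSERT @files (fname)
EXEC master.dbo.xp_cmdshell 'dir C:\XLFiles\*.xlsx /b';  -- one row per file name
DELETE FROM @files WHERE fname IS NULL;

DECLARE @i int = 1, @n int = (SELECT MAX(id) FROM @files), @sql nvarchar(max);
WHILE @i <= @n
BEGIN
    SELECT @sql = N'INSERT INTO dbo.AllData
        SELECT * FROM OPENROWSET(''Microsoft.ACE.OLEDB.12.0'',
            ''Excel 12.0;Database=C:\XLFiles\' + fname + N';HDR=YES'',
            ''SELECT * FROM [Sheet1$]'')'
    FROM @files WHERE id = @i;
    EXEC (@sql);
    SET @i += 1;
END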
I have a master table containing details of over 800,000 surveys, made up of approximately 400 distinct document names and versions. Each document can have as few as 10 questions but as many as 150. Each question represents one row.
My challenge is to create a separate spreadsheet for each of the 400 distinct document names and versions containing all the rows and columns present in the master table. The largest number of rows would be around 150 and therefore each spreadsheet will not be very big.
e.g. in my sample data below, I will need to create individual Excel files named as follows...
"Document1Version1.xlsx" containing all the column names and 6 rows for the 6 questions relating to Document 1 version 1
"Document1Version2.xlsx" containing all the column names and 8 rows for the 8 questions relating to Document 1 version 2
"Document2Version1.xlsx" containing all the column names and 4 rows for the 4 questions relating to Document 2 version 1
I assume that one of the first things is to create a lookup of the distinct document names and versions assign some variables and then use this lookup to loop through and sequentially filter the master table data ready for creating the individual Excel files.
--CREATE TEMP TABLE FOR EXAMPLE
IF OBJECT_ID('tempdb..#excelTest') IS NOT NULL DROP TABLE #excelTest
CREATE TABLE #excelTest (
[rowID] [nvarchar](10) NULL,
[docName] [nvarchar](50) NULL,
[Code] .....
--Output
rowIDdocNamedocVersionquestionblankField
1document11q1NULL
2document11q2NULL
3document11q3NULL
4document11q4NULL
5document11q5NULL
6document11q6NULL
[Code] .....
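That lookup-and-loop plan matches what an SSIS Foreach Loop would do. As a pure T-SQL sketch of it: a hypothetical permanent table dbo.SurveyMaster stands in for the master table, the output folder and server name are placeholders, and bcp emits delimited text (true .xlsx output would need SSIS or a pre-built workbook):
DECLARE @doc varchar(50), @ver varchar(10), @cmd varchar(2000);
DECLARE docs CURSOR FOR
    SELECT DISTINCT docName, CAST(docVersion AS varchar(10))
    FROM dbo.SurveyMaster;  -- hypothetical master table
OPEN docs;
FETCH NEXT FROM docs INTO @doc, @ver;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- one file per document/version, e.g. Document1Version1.csv
    SET @cmd = 'bcp "SELECT * FROM MyDb.dbo.SurveyMaster WHERE docName = ''' + @doc +
               ''' AND docVersion = ''' + @ver + '''" queryout "C:\out\' + @doc +
               'Version' + @ver + '.csv" -c -t, -T -S MYSERVER';
    EXEC master.dbo.xp_cmdshell @cmd;
    FETCH NEXT FROM docs INTO @doc, @ver;
END
CLOSE docs; DEALLOCATE docs;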
I'm trying to upload 1000 txt files into one table in SQL. I'm using the following query to upload one txt file at a time:
bulk insert [dbo].AAA_2013_2015
from 'dataserverSQL Data FilesSQL_EMELIZFC x Bloque Detallada201308 Detalle FacturasFACT_BLOQ_AGO13 (4).txt'
with (firstrow = 2,
lastrow = ???,
fieldterminator = ';',
rowterminator = '0x0A')
I want the query to skip the last row, because it gives me the following error:
Msg 4866, Level 16, State 1, Line 1
The bulk load failed. The column is too long in the data file for row 1, column 17. Verify that the field terminator and row terminator are specified correctly.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
Is there a command to skip the last row, something like lastrow = all-1... or something like that?
I also executed it using the MAXERRORS option, like this:
bulk insert [dbo].AAA_2013_2015
from 'dataserverSQL Data FilesSQL_EMELIZFC x Bloque Detallada201308 Detalle FacturasFACT_BLOQ_AGO13 (15).txt'
with (firstrow = 2,
fieldterminator = ';',
MAXERRORS = max_errors,
rowterminator = '0x0A')
It does not recognize the MAXERRORS command; I also tried to put a number of errors instead of max_errors.
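BULK INSERT has no built-in "all but the last row" option, and LASTROW (like MAXERRORS) only accepts an integer literal, which is why max_errors was rejected. A hedged sketch: count the file's line breaks first, then inject the number through dynamic SQL (path shortened to a placeholder; adjust the minus-one depending on whether the file ends with a trailing newline):
DECLARE @lines int, @sql nvarchar(max);

-- count rows by reading the whole file as a single CLOB
SELECT @lines = LEN(BulkColumn) - LEN(REPLACE(BulkColumn, CHAR(10), ''))
FROM OPENROWSET(BULK 'C:\data\FACT_BLOQ_AGO13.txt', SINGLE_CLOB) AS f;

SET @sql = N'BULK INSERT dbo.AAA_2013_2015
             FROM ''C:\data\FACT_BLOQ_AGO13.txt''
             WITH (FIRSTROW = 2, LASTROW = ' + CAST(@lines - 1 AS nvarchar(10)) + N',
                   FIELDTERMINATOR = '';'', ROWTERMINATOR = ''0x0A'')';
EXEC (@sql);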
I have a requirement to load multiple flat files into a target table.
I have created a package which loads the files into the target table using a Foreach Loop container.
But now the requirement has changed: I have to take only those files from the table where status = 'Success' and the max JobId. The below query gets the files which need to be loaded:
select [JobLogKey],[SrcNm],[DestNm]
FROM [ConfigRep].[dbo].[JobLog]
Where [JobId]=
(Select Max(cast([JobId] as Int)) Jobid
FROM [ConfigRep].[dbo].[JobLog]
Where [JobStat]='Success')
Output:-
JobLogKey SrcNm DestNm
268 H:Data PlatformSource FileClient2LocHGSSpecLocation.txt Location.txt
269 H:Data PlatformSource FileClient1LocHGSSpecLocation.txt Location.txt
I have to load the 2 files listed under SrcNm above. I have created one variable called FileToLoad as Object, mapped to the result set of the above query. I have created JobId, SrcNm and DestNm variables to catch the record on every loop, and I have created 2 Foreach Loop containers.
Below is a screenshot of the outer Foreach Loop. Up to here it's working fine, but the inner Foreach Loop container is not executing any task under it. How do I get it done?
I am getting the following error when trying to load multiple Excel files using a Foreach Loop container in SSIS. I tried to put the quotes in several different ways but still can't get rid of this error. I was able to successfully load a single Excel file, but when I use the Foreach Loop container, that's when I have problems. Any help is greatly appreciated. Thanks.
TITLE: Package Validation Error
------------------------------
Package Validation Error
------------------------------
ADDITIONAL INFORMATION:
Error at Package1 [Connection manager "SourceConnectionExcel"]: The connection string components cannot contain unquoted semicolons. If the value must contain a semicolon, enclose the entire value in quotes. This error occurs when values in the connection string contain unquoted semicolons, such as the InitialCatalog property.
Error at Package1: The result of the expression ""Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + @[User::Folder] + @[User::file] + ";Extended Properties="Excel 8.0;HDR=NO";"
" on property "ExcelFilePath" cannot be written to the property. The expression was evaluated, but cannot be set on the property.
(Microsoft.DataTransformationServices.VsIntegration)
------------------------------
BUTTONS:
OK
------------------------------
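The usual culprit here: the Extended Properties value contains semicolons, so its inner quotes have to be escaped with backslashes inside the SSIS expression. A hedged version of the connection string expression:
"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + @[User::Folder] + @[User::file] + ";Extended Properties=\"Excel 8.0;HDR=NO\";"
Note also that if the expression is attached to the ExcelFilePath property (as the error suggests), it should contain only the path portion, @[User::Folder] + @[User::file]; the full provider string belongs on the ConnectionString property instead.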
I have an Excel file which contains lots of sheets. Some of them are named DW-<day>-<month> (e.g. DW-1-July); I have sheets like this for the whole month. I have other sheets too, with different names. I would like to import data from the DW sheets only. From my research I have found that this can be achieved via a Foreach Loop container.
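That is the right track: in SSIS the sheet list typically comes from a Foreach ADO.NET Schema Rowset enumerator (Tables schema), with a filter keeping only names starting with DW-. For a quick check outside SSIS, a single DW sheet can also be read by naming it in an OPENROWSET source query (path is a placeholder):
SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=C:\data\book.xlsx;HDR=YES',
                'SELECT * FROM [DW-1-July$]');  -- sheet name plus trailing $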
I am using the bulk copy command for exporting data table-wise from a database to CSV files, and it was working fine. For the last 3-4 days, when exporting, the data in the CSV file for some tables has been coming out as junk.