How To Import Database Data From a Backup File?
Nov 20, 2007
Hello, all:
I have two SQL 2000 servers, one for production and the other as a backup server. I use SQL Agent to create a backup daily in SQL Server Enterprise Manager.
Now I want to import the backup data (generated by the production server) into the backup server, but I don't know how to do it.
Is it possible? Can anyone give me a solution?
Can anyone tell me how I can import an XLS file and read the data inside it so I can insert it into the respective table in my SDF database on the mobile device?
I found that the database log file can hold more records after performing a BACKUP DATABASE statement.
For example:
I create a database and limit the log file to 2 MB, then I create a table and insert data.
If I back up the database before inserting data, the table can hold 192 records before the log file is full.
If I don't perform the BACKUP DATABASE statement, DBCC SQLPERF(LOGSPACE) indicates the log utilization ratio is less than 40% after inserting the same 192 records.
Why?
Here is my code:
Code Snippet
create database db_test
on primary
(
    name = db_test,
    filename = 'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\db_test.mdf'
)
log on
(
    name = db_test_log,
    filename = 'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\db_test_log.ldf',
    maxsize = 2mb
)
go
backup database db_test to disk = 'db_test.bak'  -- if I don't execute this line, the log file can hold a lot of records
go
create table db_test..table1(col char(8000))
-- insert data to fill up the database log
declare @n int
set @n = 0
while @n < 192
begin
    insert into db_test..table1 values(replicate('a', 8000))
    set @n = @n + 1
end
Question 2: I create a database and limit the log file to 2 MB. Then I create a table and insert data in an endless loop.
After the insert loop has been running for a while, error 9002 occurs, indicating that the log file for the database is full. But the DBCC SQLPERF(LOGSPACE) command indicates the utilization ratio is low, log_reuse_wait_desc in sys.databases is 'CHECKPOINT', and I can still insert data while log_reuse_wait_desc remains 'CHECKPOINT'.
As I understand it, a checkpoint can't truncate the log under the full recovery model; only a log backup can truncate the transaction log. So if the log is not full, why is error 9002 encountered, and why does log_reuse_wait_desc return 'CHECKPOINT'?
Here is my code:
Code Snippet
create database db_test
on primary
(
    name = db_test,
    filename = 'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\db_test.mdf'
)
log on
(
    name = db_test_log,
    filename = 'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\db_test_log.ldf',
    maxsize = 2mb
)
go
create table db_test..table1(col char(8000))
-- insert data to fill up the database log
declare @n int
set @n = 0
while @n <> -1
begin
    insert into db_test..table1 values(replicate('a', 8000))
end
Using SQL Server 2005 Management Studio, I attempted to back up a database and received this error:
Backup failed: System.Data.SqlClient.SqlError: Backup and file manipulation operations (such as ALTER DATABASE ADD FILE) on a database must be serialized. Reissue the statement after the current backup or file manipulation is completed (Microsoft.SqlServer.Smo)
Program location:
at Microsoft.SqlServer.Management.Smo.Backup.SqlBackup(Server srv) at Microsoft.SqlServer.Management.SqlManagerUI.BackupPropOptions.OnRunNow(Object sender)
Backup Options were set to:
Back up to the existing media set
Overwrite all existing backup sets
I am fairly new to SQL 2005. Can someone help me get past this issue? What other information do I need to provide?
I have a backup of a SQL Server 2000 database. I am now working with SQL Server 2005. Is there any facility to import the entire 2000 database into SQL Server 2005? Please help me if there is any way. Thanks in advance.
I need to restore a SQL Server 2005 database from backup. The backup consists of three files, named user.bak0, user.bak1 and user.bak2.
What is the syntax of the RESTORE FILELISTONLY and the RESTORE DATABASE ... statements in this case?
I usually write: restore filelistonly from disk = 'path and filename.bak', then restore database zy from disk = 'path and filename.bak' with replace, move ..., move ...
This works, but I cannot use it with a backup split across multiple files. The files are much too big to combine into one file.
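For a backup striped across several files, every file can be listed in the FROM clause of the same statement. A minimal sketch, assuming the three files live in a hypothetical 'C:\Backups' folder and that 'zy_Data' / 'zy_Log' and the target paths are placeholders to be replaced with what FILELISTONLY actually reports:

RESTORE FILELISTONLY
FROM DISK = 'C:\Backups\user.bak0',
     DISK = 'C:\Backups\user.bak1',
     DISK = 'C:\Backups\user.bak2'

RESTORE DATABASE zy
FROM DISK = 'C:\Backups\user.bak0',
     DISK = 'C:\Backups\user.bak1',
     DISK = 'C:\Backups\user.bak2'
WITH REPLACE,
     MOVE 'zy_Data' TO 'D:\Data\zy.mdf',   -- target paths are assumptions
     MOVE 'zy_Log'  TO 'D:\Data\zy_log.ldf'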
I'm moving data between identical tables and have to use a flat file as an intermediary. I thought: "No problem, SSIS can do a quick export to a file, then move the file to another server, then use SSIS to import the data to the new server."
Seems simple, right?
I'm hitting all sorts of surprising data conversion errors. I used the Export Wizard to create the export package, and that works fine. However, using the same flat file definition, the import package fails, even when I have no destination. That is, I have just one Data Flow task that contains only one component: the Flat File source. When I run the package, the flat file source fails with data type conversion and truncation errors. One of the obvious errors involves boolean types. The SQL field is a bit, SSIS defined the column as DT_BOOL, and the values written to the file are the literal text values "TRUE" and "FALSE". So SSIS converts a SQL data type of bit to "TRUE" and "FALSE" on export, but can't make the reverse conversion on import?
Does anyone else find this surprising? I would expect that what SSIS exports, it can import given all the same table and flat file definitions. Is SSIS the wrong tool to do such simple bulk copies? I'd like to avoid using BCP because this process will need to run automatically within SQL Agent so we can leverage all the error tracking and system monitoring.
I have one column in SQL Server 2005 of data type VARCHAR(4000).
I have imported SQL Server 2005 database data into an .mdb file. After the import, the above column's data type was converted to the Memo type in the Access database.
Now, when I try to import data from this MS Access file (db1.mdb) into another SQL Server 2005 database, I get a Unicode conversion error on the Memo data type in the Export/Import Data Wizard.
Could you please let me know what the reason is?
I know that the Memo data type is not supported in SQL Server 2005.
I am using SQL Server 2005 Standard Edition with SP2.
Please help me understand this issue correctly.
I backed up my database through SQL Server 2000 Query Analyzer: "BACKUP DATABASE Mysite TO DISK = 'd:\BatabaseBackup\Mysite.bak'", and got a .bak file to import to my hoster's server. Now I need to import (restore) the database from that hoster's server back to my local server. I got a .bak file called "MysiteExport.bak" and transferred it to my computer at 'd:\BatabaseBackup\MysiteExport.bak'. Then I used the SQL Server Import Wizard to import "MysiteExport.bak" and it failed.
How can I import/restore the database from this "MysiteExport.bak" file, which is already on my computer? If I use SQL Query Analyzer, what is the correct statement?
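A .bak file is a backup, not a data file the Import Wizard can read, so it has to be restored with RESTORE DATABASE. A minimal sketch for Query Analyzer, where the logical names 'Mysite_Data' / 'Mysite_Log' and the target paths are assumptions that should be replaced with whatever RESTORE FILELISTONLY reports:

RESTORE FILELISTONLY FROM DISK = 'd:\BatabaseBackup\MysiteExport.bak'

RESTORE DATABASE Mysite
FROM DISK = 'd:\BatabaseBackup\MysiteExport.bak'
WITH REPLACE,
     MOVE 'Mysite_Data' TO 'C:\Program Files\Microsoft SQL Server\MSSQL\Data\Mysite.mdf',
     MOVE 'Mysite_Log'  TO 'C:\Program Files\Microsoft SQL Server\MSSQL\Data\Mysite_log.ldf'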
I have no previous experience with MS SQL and I have this little problem: I import data into a table from a txt file (a log file produced after a process) and I want to put the date/time string into a column with data type datetime, but I can't make it work; there is an error message (can't store char data in a datetime field...). Any help? Thanks in advance :P
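One common workaround is to load the raw text into a varchar column first and convert it afterwards with an explicit style. A minimal sketch under those assumptions; the table and column names are hypothetical, and style 103 assumes the log writes dates as dd/mm/yyyy:

-- staging table holds the text exactly as it appears in the log file
CREATE TABLE log_staging (raw_date varchar(30), message varchar(4000))

-- final table with a proper datetime column
CREATE TABLE log_final (event_date datetime, message varchar(4000))

-- convert while copying; style 103 = dd/mm/yyyy (adjust to match the file)
INSERT INTO log_final (event_date, message)
SELECT CONVERT(datetime, raw_date, 103), message
FROM log_staging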
I have a .csv file which is imported into my database using DTS. When I look at the DTS package in design mode I can see that it creates a table and then copies the data from the file into the created table.
Nothing wrong there.
But when the contents of the file change (e.g. a column is added or removed), the create table script raises an error. The script is not dynamic; it is created when the package is created and only changes when the package itself is re-created.
Is there a way to make the create table script dynamic, so that the table is dropped and re-created according to the format of the .csv file?
I want to put every value between double quotes into a particular column of a table (let's call it the IMPORT table). The problem is that the quoted values represent hours, so for "184:30" I'd like to end up with 184.5 hours.
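A sketch of one way to do the conversion in T-SQL, assuming the value arrives in a varchar column (here called raw_hours, a hypothetical name) in the IMPORT table:

-- strip the double quotes, then split on ':' into hours and minutes
SELECT raw_hours,
       CAST(LEFT(v, CHARINDEX(':', v) - 1) AS decimal(10, 2))
       + CAST(SUBSTRING(v, CHARINDEX(':', v) + 1, 2) AS decimal(10, 2)) / 60.0 AS hours_decimal
FROM (SELECT raw_hours, REPLACE(raw_hours, '"', '') AS v FROM [IMPORT]) AS t
-- "184:30" -> 184.50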
Hi all, I'm trying to import CSV file values using BULK INSERT but I'm getting an error in my code, so can anyone help me with this? The following is the code I have created:

-- create table
CREATE TABLE CSVTest
(
    ID INT,
    FirstName VARCHAR(40),
    LastName VARCHAR(40),
    BirthDate SMALLDATETIME
)

-- import from CSV using bulk insert
BULK INSERT CSVTest
FROM 'c:\csvtest.txt'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
)
GO
I'm getting an error like: Msg 4860, Level 16, State 1, Line 1. Cannot bulk load. The file "c:\csvtest.txt" does not exist.
The thing is, I have created the CSV file in the C drive with some values.
Now I would like to use the Foreach Loop container in an SSIS package to loop through however many Excel files are placed in a directory and then perform an import operation into a SQL table for each of these files in turn.
Hi all, I have to import bulk data from an Excel file into SQL Server 2000. I'm using ASP.NET 1.1 with C#, and I have to make a front end (Windows application) for this. Help me out if anyone knows how. Thanks & regards, Anant Vijay
Hi, does anyone know how to access the *.evt file (SysEvent.Evt) from SQL in order to import data from the file into a server table?
When you run Event Viewer there is an option to save the log file as a *.csv file and then use it. I want to eliminate this step (or make it automatic) and get the data into SQL right away.
Hi guys. We're designing a system to import retail sales till data from 800 shops with fairly modern tills which hold transactions in client Access databases.
Basic data is (Shop_ID, Till_ID, Receipt_No, Receipt_Line_No, Sales_Date, Item_Category, ...).
Record width of the above in one de-normalized SQL table is 46 bytes. The average is 1.5 transactions per receipt. The average file size per shop is 150k, and the SFTP server will receive 800 of these per day. As we will own the software on each of the tills, we expect the data to be very clean in terms of validation. All of this has to be loaded into a SQL 2005 box overnight.
The question is what file format we should choose to receive at HQ. The current proposal is XML; my proposal is CSV.
Another question is how best to import the data. The current proposal is to write a .NET application to perform the import; my proposal is to just use SQL to do it.
What do you guys think, and why? Sorry, I've not been able to find a previous topic on the subject but will happily read if you have any links. Further info on request. GW
I have played around with SSIS, in addition to reading an SSIS book front to back, but I am still a little confused about how to import an XML file containing relational data.
Basically I want to import the XML data into three tables: categories, products and fields. A product can belong to one or more categories and has one or more fields which store information about the product.
Using the XML Source component I can load the XML from the file, but I can only output one section (category, product or field) at a time. Since the relationship is inferred from the hierarchical structure of the XML (e.g. the fields don't store an ID of the product they belong to), I am not sure how to import the relationships into my tables.
If anyone has any tips on how I can go about that, then it would be most appreciated :)
Hi, I have a package that uses a Foreach Loop container to import flat files and an OLE DB Destination component to insert the data into some staging tables (using table fast load with a maximum insert commit size of 1000 rows); the biggest individual table import would be circa 5000 rows. At the end of each file import a stored proc is called to transfer the data into the production tables, then the next file is imported.
Periodically (when importing more than 5 or 6 files) the process fails with the error message:
The transaction log for database 'blah' is full. To find out why space in the log cannot be reused, see the log_reuse_wait_desc column in sys.databases
This occurs when committing the data to staging. I do not start any transactions during this process, and the staging tables are cleaned out by truncating them, so I'm not sure exactly what is causing the log file to fill up (I'm not a DBA).
I know this is not specifically an SSIS problem, but can anyone give me a suggestion about the best way to handle the log file during an SSIS import? Should I execute a DBCC SHRINKFILE before each flat file is imported? Is there some other approach I should be taking, either to insert the data or to move it from staging to production?
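Not a definitive answer, but a sketch of the kind of checks and options usually considered before reaching for SHRINKFILE, assuming 'blah' is the database named in the error message and the backup path is a placeholder:

-- see which recovery model is in use and why the log can't be reused
SELECT name, recovery_model_desc, log_reuse_wait_desc
FROM sys.databases
WHERE name = 'blah'

-- option 1: if point-in-time recovery of this database isn't required,
-- SIMPLE recovery lets the log truncate automatically at checkpoints
ALTER DATABASE blah SET RECOVERY SIMPLE

-- option 2: stay in FULL recovery and back up the log between file loads
BACKUP LOG blah TO DISK = 'D:\Backups\blah_log.trn'  -- path is an assumption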
I have exported a table from a SQL 2000 database. This table contained an image column. How can I import this CSV file into a SQL 2005 database? When I try, I get a data overflow error. When I tried defining the import data type as image, I got an error that this data type is not supported.
I then imported the data using DT_TEXT. This performed the import but changed the data. I would guess it assumed Unicode input.
I wanted to know if there is a way to import data from a flat file without specifying the delimiters. I want to import each line into one row and one column, so that I can use the SUBSTRING function to break up the data as and when I want, and not as dictated by a delimited format file or the wizard.
i.e. if row one had "abc"|"1453"|"Jack"|"Smith"| etc., then rather than importing these as different columns and rows, I want it all in one row, one column.
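A sketch of one way to do this with BULK INSERT, assuming a hypothetical single-column staging table and file path; because the target table has only one column, only the row terminator matters and each whole line lands in that column:

-- one column wide enough to hold an entire line
CREATE TABLE raw_lines (line varchar(8000))

BULK INSERT raw_lines
FROM 'c:\rawlines.txt'          -- hypothetical path
WITH (ROWTERMINATOR = '\n')

-- break the line apart later with SUBSTRING/CHARINDEX as needed
SELECT SUBSTRING(line, 1, CHARINDEX('|', line) - 1) AS first_field
FROM raw_lines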
Hi. I want to upload a flat file from an ASP.NET website to a SQL Server. I have used the OLE DB provider to upload Excel files and a FoxPro provider to upload DBF files, and then used SqlBulkCopy to put the data into my SQL Server.
So I was wondering what kind of provider I should use to import a flat file into the database, and whether I have to use a format file in order to import it. Any help will be appreciated. Regards, Karen
Error 0xc0202055: Data Flow Task: The column delimiter for column "Column x" was not found.
I would like the import to continue even when the error occurs. I have set the error output to ignore failures, but that doesn't seem to be working. Any suggestions on what to do next, or on how to skip the rows that cause the problem?
Hi all, I am trying to import data via DTS from a CSV file. I already have the "empty" tables created with the proper column names. Now I have a subset of data for each of these tables which I am trying to import. The problem I am facing is that there is a data type conflict between the source (the CSV file) and the destination (the table already in the DB). All the data in the CSV file appears to be DBTYPE_WSTR, whereas in the table the types differ (some are DBTYPE_WSTR, some are DBTYPE_DATE, and so on). Is there a way I can import the data successfully? This has become a work-stoppage issue now. I had to take this approach of creating the empty tables first and then importing the data because the backup file was very, very large and could not be copied to our domain. Please help me out with this.
I have tried the following (my file name is vendor_insertFromFile2.sql):

BULK INSERT (dbo.vendors VendorName, Street, City, Region, Country, PostalCode, Telephone, PortalId, Fax, Email, Website, CreatedDate, Unit, LastName, Cell)
from 'D:\AccessNAICSxPortal0_SampleData.txt'

Error message:

Server: Msg 170, Level 15, State 1, Line 2
Line 2: Incorrect syntax near '('.

Data in the file:

"CompanyName","Address","City","State","country","fullzip","FullPhone","Portal","Fullfax","Email","URL","date","ID","Contact_1","800No"
"DAVID STRAWN","460 FLOWERING TRL","GRAYSON","GA","United States","30017","770-277-8709",0,"","bkroom@bellsouth.net","WWW.BACKROOM4MEN.COM","09/02/05",1,"DAVID STRAWN"
"BRADRUSS","3353 S MAIN ST STE 130","SALT LAKE CITY","UT","United States","84115","801-975-0374",0,"","rusbrad@msn.com","WWW.BADANDNASTY.COM","09/02/05",2,"BRAD JENSEN"

Thanks in advance for any pointers on what is wrong, or a link with working examples.
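For reference, BULK INSERT does not accept a column list after the table name, which is what triggers the syntax error near '('. A minimal sketch of the statement shape, assuming a comma-delimited file with a header row; FIRSTROW and the terminators are assumptions, and the quoted fields would still need a format file or a post-load cleanup step to strip the double quotes:

BULK INSERT dbo.vendors
FROM 'D:\AccessNAICSxPortal0_SampleData.txt'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FIRSTROW = 2  -- skip the header row
)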