SQL 2012 :: Log File Data Transfer Amount (MB) Versus Data File Transfer Amount (MB)
Mar 19, 2014
In the full recovery model, if I run a transaction that inserts 10 MB of data into a table, then 10 MB of data is written to the data file. Does this mean the log file will grow by exactly 10 MB as well?
I understand that all transactions are logged to the log file to enable rollback and point-in-time recovery, but what is actually physically stored in the log file for this transaction's record? Is it the text of the command from the transaction or the actual physical data from that transaction?
I ask because, say I have two drives: one with 5 MB/s write speed for the log file and one with 10 MB/s write speed for the data file. If I start inserting 10 MB of data per second into the table, am I going to be limited to 5 MB/s by the log file drive, or is SQL Server not going to try to log all 10 MB each second to the log file?
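One way to check this empirically is to snapshot the cumulative bytes written per database file before and after the insert; the difference shows how much really went to the data file versus the log. A minimal sketch using the sys.dm_io_virtual_file_stats DMV:

-- Run before and after the 10 MB insert and diff the mb_written values.
SELECT mf.name,
       mf.type_desc,                              -- ROWS = data file, LOG = log file
       vfs.num_of_bytes_written / 1048576.0 AS mb_written
FROM sys.dm_io_virtual_file_stats(DB_ID(), NULL) AS vfs
JOIN sys.master_files AS mf
  ON mf.database_id = vfs.database_id
 AND mf.file_id = vfs.file_id;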
View 6 Replies
Sep 28, 2006
Hi All,
I used a data flow task, and when trying to transfer data from an OLE DB Source (~75 lakh, i.e. 7.5 million, records) to an OLE DB Destination, SSIS fails midway with an error saying the transaction log got filled, try again after clearing it.
My question is: what is the most efficient way to transfer more than 50 lakh (5 million) records while ensuring the load doesn't fail in the middle?
Thanks in Advance,
Mithun.
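For what it's worth, the usual fixes are either to commit in smaller chunks from SSIS (on the OLE DB Destination with fast load, set Rows per batch and Maximum insert commit size to something like 50,000 so the log can be reused between commits) or to do the copy as a batched T-SQL loop. A hedged sketch of the latter, with hypothetical SourceTable/DestTable names and an ascending key column Id:

DECLARE @batch int, @rows int;
SELECT @batch = 50000, @rows = 1;

WHILE @rows > 0
BEGIN
    -- Copy the next slice of rows that are not yet in the destination;
    -- each iteration is its own small transaction.
    INSERT INTO dbo.DestTable (Id, Col1, Col2)
    SELECT TOP (@batch) s.Id, s.Col1, s.Col2
    FROM dbo.SourceTable AS s
    WHERE NOT EXISTS (SELECT 1 FROM dbo.DestTable AS d WHERE d.Id = s.Id)
    ORDER BY s.Id;

    SET @rows = @@ROWCOUNT;   -- 0 when everything has been copied
END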
View 5 Replies
Aug 16, 2007
Hi All,
1. I want to transfer .csv file data into a SQL Server table. I tried with DTS, but while creating the DSN it does not prompt me to attach the .csv file. Give me the proper steps to perform the data transfer.
2. I want the result of my query in an Excel or text file by using a SQL query (like SELECT * FROM employee WHERE emp_salary > 10000 TO 'c:\emp.xls'). I know the other way: right-click in the Query Analyzer window and select the Results to File option. But I want the result by using a SQL query.
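Both halves of this question have T-SQL routes that skip the DTS wizard entirely; a sketch, treating every path and table name below as a placeholder. BULK INSERT loads the .csv, and bcp via xp_cmdshell writes a query result to a file (xp_cmdshell must be enabled, and the paths are as seen by the server):

-- 1. Load a CSV file into an existing table.
BULK INSERT dbo.employee_staging
FROM 'C:\data\employees.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

-- 2. Write a query result to a file with bcp, driven from T-SQL.
EXEC master..xp_cmdshell
    'bcp "SELECT * FROM mydb.dbo.employee WHERE emp_salary > 10000" queryout "C:\emp.txt" -c -T -S(local)';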
View 2 Replies
Mar 20, 2008
Hi All,
I have multiple text files; let us say: a1.txt, b1.txt, c1.txt.
I have to port each text file's data into a SQL Server table that has the same structure, i.e. x1, y2, z3 (SQL Server tables).
Now I have to transfer: a1.txt file data ----to--- x1, b1.txt file data ----to--- y2, c1.txt file data ----to--- z3,
using SSIS. In the same way, I have to transfer more than 250 files at a time; manually binding 250 files into the package is a very cumbersome and time-consuming process.
So, can anyone give a suggestion to solve this issue?
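Inside SSIS the standard answer is a Foreach (File) Loop container feeding the file name into a variable, so one data flow handles all 250 files. If the files can instead be loaded server-side, a mapping table driving dynamic BULK INSERT does the same job; a sketch with hypothetical names and paths:

-- Hypothetical mapping of each file to its destination table.
CREATE TABLE #FileTableMap (FileName varchar(260), TableName sysname);
INSERT INTO #FileTableMap
SELECT 'C:\files\a1.txt', 'x1' UNION ALL
SELECT 'C:\files\b1.txt', 'y2' UNION ALL
SELECT 'C:\files\c1.txt', 'z3';   -- ...extend to all 250 files

DECLARE @file varchar(260), @table sysname, @sql nvarchar(4000);
DECLARE c CURSOR FOR SELECT FileName, TableName FROM #FileTableMap;
OPEN c;
FETCH NEXT FROM c INTO @file, @table;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'BULK INSERT dbo.' + QUOTENAME(@table) +
               N' FROM ''' + @file +
               N''' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')';
    EXEC sp_executesql @sql;   -- one load per file/table pair
    FETCH NEXT FROM c INTO @file, @table;
END
CLOSE c;
DEALLOCATE c;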
View 2 Replies
Feb 15, 2008
I'm having problems designing a package that attempts a fast-load data transfer but falls back to a regular-speed load with error redirection in the event of an error.
The way I designed this was to add one data flow task to my package called "DFT FASTLOAD". The data flow copies a table SRC to another table DEST in the same SQL Server database. In the error handler for the data flow task I copied the original data flow task and changed the name to "DFT REGULARLOAD with Error redirection". In this data flow task I did not use fast load and additionally redirected errors to a text file.
In the data flow task "DFT FASTLOAD" I am copying from a varchar source field (with non-date strings) to a datetime destination field to force errors. However, the data flow task "DFT REGULARLOAD with Error redirection" never seems to start transferring data from source to destination. It turns yellow (after the error occurs in "DFT FASTLOAD"), but no data is transferred; it seems to hang.
Do I need to increase the MaximumErrorCount or something? The data flow task "DFT FASTLOAD" does not turn red when the error occurs, it just remains yellow, so I assume I'm on the right track since the error appears to be caught.
I have added screenshots ... hopefully these screenshots will clarify my problem.
DESIGN:
http://i256.photobucket.com/albums/hh179/abzbank/DESIGN_FASTLOAD1.jpg
http://i256.photobucket.com/albums/hh179/abzbank/DESIGN_FASTLOAD2.jpg
http://i256.photobucket.com/albums/hh179/abzbank/DESIGN_FASTLOAD3.jpg
http://i256.photobucket.com/albums/hh179/abzbank/DESIGN_FASTLOAD4.jpg
http://i256.photobucket.com/albums/hh179/abzbank/DESIGN_FASTLOAD5.jpg
http://i256.photobucket.com/albums/hh179/abzbank/DESIGN_FASTLOAD6.jpg
RUNTIME:
http://i256.photobucket.com/albums/hh179/abzbank/RUN_FASTLOAD7.jpg
http://i256.photobucket.com/albums/hh179/abzbank/RUN_FASTLOAD8.jpg
http://i256.photobucket.com/albums/hh179/abzbank/RUN_FASTLOAD9.jpg
http://i256.photobucket.com/albums/hh179/abzbank/RUN_FASTLOAD10.jpg
http://i256.photobucket.com/albums/hh179/abzbank/RUN_FASTLOAD11.jpg
I can provide more details if needed... but really this is just a basic test.
Any assistance would be appreciated!
View 9 Replies
May 30, 2008
Is it possible/advisable, when transferring very large amounts of data from server to server, to:
transfer the data to a new table first,
then alter the new table, adding indexes, defaults, etc. based on the original table?
If it is, what flow item would be used to transfer/alter the indexes and defaults?
I'm very new to SSIS, so the more detail you can give the better.
Thanks
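The load-first, index-afterwards order is generally the right instinct for big transfers. In SSIS it maps to a Data Flow task followed by an Execute SQL task holding the index/default script; the T-SQL underneath might look like this sketch (linked server, table, and column names are all hypothetical):

-- 1. Copy the data into a new, index-free heap.
SELECT *
INTO dbo.Orders_New
FROM LinkedSrv.SourceDb.dbo.Orders;

-- 2. Add the physical design afterwards, modelled on the original table.
ALTER TABLE dbo.Orders_New
    ADD CONSTRAINT PK_Orders_New PRIMARY KEY CLUSTERED (OrderID);
CREATE NONCLUSTERED INDEX IX_Orders_New_CustomerID
    ON dbo.Orders_New (CustomerID);
ALTER TABLE dbo.Orders_New
    ADD CONSTRAINT DF_Orders_New_Status DEFAULT ('OPEN') FOR Status;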
View 5 Replies
Jun 11, 1999
I have an old database in FoxPro that has to be converted to a SQL Server 6.5 database. A table in FoxPro has been broken into more than one table in SQL Server. So how can I transfer the data from one FoxPro table to the different tables in SQL Server 6.5?
vineet
View 1 Replies
Sep 1, 2014
I am a complete newbie to SSIS. I can create a simple package to transfer data between SQL instances, and that's about it.
I have tableA (source data) and tableB (destination data). TableA has 4 columns and tableB has 5. I want to transfer all of the columns from tableA into tableB, but the 5th column in tableB needs to be populated with the ServerInstance name of the server tableA sits on. Do I need multiple data sources to achieve this? I have tried, but no matter how I set it up, the column in the destination is set to ignore.
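If the transfer can run as a T-SQL statement (an Execute SQL task over a linked server, for instance), the fifth column can be filled inline with @@SERVERNAME; in a data flow, a Derived Column transformation fed by a variable holding the instance name plays the same role. A sketch with hypothetical column names:

INSERT INTO dbo.TableB (Col1, Col2, Col3, Col4, SourceInstance)
SELECT Col1, Col2, Col3, Col4,
       @@SERVERNAME        -- the instance this statement runs on
FROM dbo.TableA;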
View 2 Replies
Jun 5, 2015
I need to generate a csv file from another csv file. It seems simple, but here is the tricky part:
each output may have a maximum of 1000 lines; when I reach that limit, I need to create another csv and keep filling the new one.
For example:
I have a csv file called fileA with 2000 lines and another csv called fileB with 1500 lines.
I need to loop through a folder and get fileA, create an output called FileAOutput and start filling it; when I reach 1000 lines, I need to create a FileAOutput_2 and fill it with the other 1000 lines. Then I go to fileB and do the same thing, but in that case I'll have 500 lines in the second output.
View 5 Replies
Feb 12, 2014
I am trying to transfer all jobs from one instance to another by using SQL Server Data Tools. However, when I try to make the SMO connection I get this error:
A network-related or instance-specific error occurred while establishing a connection to SQL Server.
In order to solve this issue I have tried these solutions:
1. SQL Server should be up and running. (OK)
2. Enable TCP/IP in SQL Server Configuration. (OK)
3. Open the port in Windows Firewall. (FW ACCEPTS ALL LOCAL PORTS)
4. Enable remote connections. (CHECKED THE sp_configure SETTINGS; I even right-clicked the instance and checked its properties)
5. Enable the SQL Server Browser service. (SQL Server Browser has been restarted)
6. Create an exception for sqlbrowser.exe in the firewall. (FW ACCEPTS ALL PROGRAMS)
7. I tried Windows and SQL authentication, both with the sysadmin role.
8. The INSTANCE name has also been checked millions of times.
View 0 Replies
Apr 13, 2015
I have a situation where my network admin observed high network utilization between the 2 nodes in our AG (the primary node and the DR site, 2 separate locations of course); he then advised compressing the data transfer between those 2 nodes, as the previous DBA had already done that before!
OK, I have no clue about this, so I decided to google it and got nothing. My backups are already compressed through a third-party app (just in case that matters to the subject).
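For the record, Availability Group log stream compression is governed by trace flags rather than an AG-level option, and on SQL 2014 asynchronous-commit replicas already compress the log stream by default. A hedged sketch; verify both flags against the documentation for your exact build before relying on them:

-- Trace flag 9592 is documented as enabling log stream compression
-- for synchronous-commit availability groups (off by default because
-- the compression costs CPU and can add commit latency).
DBCC TRACEON (9592, -1);

-- Trace flag 1462, conversely, disables the compression that
-- asynchronous-commit replicas use by default.
-- DBCC TRACEON (1462, -1);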
View 3 Replies
Feb 10, 2007
Hi, I'm PCV.
I want to know how to calculate the amount of data (in MB) that is transferred from one server to another,
Publisher ---> Subscriber, using merge replication. I know that the amount of data depends on the number of rows and the size of the columns; I only want to know how to calculate that amount of data. I am using SQL Server 2000 on Windows XP Professional. Thank you.
PCV
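Short of a network trace there is no exact counter for merge traffic on SQL 2000, but the payload can be approximated as rows to synchronize times the average stored row size. A rough sketch, assuming a hypothetical replicated table dbo.MyArticle with columns col1 and col2 (sp_spaceused gives a cruder per-table figure):

SELECT COUNT(*) AS rows_to_sync,
       AVG(1.0 * (DATALENGTH(col1) + DATALENGTH(col2))) AS avg_row_bytes,
       SUM(1.0 * (DATALENGTH(col1) + DATALENGTH(col2))) / 1048576.0 AS approx_mb
FROM dbo.MyArticle;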
View 2 Replies
Mar 18, 2008
I'm creating a temporary table in a SQL 2005 stored procedure that contains the transaction amounts entered in every period <= the period the user enters.
I can return that total in my result set. But I also need to separate out, by account, the amounts entered in just the single period the user enters. There can be many entries or no entries in any period. I populate the temporary table this way:
SELECT
t.gl7accountsid,
a.accountnumber,
a.description,
a.category,
t.POSTDATE,
t.poststatus,
t.TRANSACTIONTYPE,
t.AMOUNT,
case
when t.transactiontype=2 then amount * (-1)
else amount
end as transamount,
t.ENCUMBRANCESTATUS,
t.gl7fiscalperiodsid
FROM
UrsinusCollege.dbo.gl7accounts a
join
ursinuscollege.dbo.gl7transactions t on
a.gl7accountsid=t.gl7accountsid
where
(t.gl7fiscalperiodsid >= 97
And
t.gl7fiscalperiodsid<=@FiscalPeriod_identifier)
And poststatus in (2,3)
and left(a.accountnumber,5) between '2-110' and '2-999'
And right(a.accountnumber,4) > 7149
And not(right(a.accountnumber,4)) in ('7171','7897')
order by a.accountnumber
Later I create a temporary table that contains budget information. I join these 2 temporary tables to produce my result set. But I don't know how to get the information for just one period. For example, if the user enters 99 as the FiscalPeriod_identifier, I need a separate field that contains only those amounts(if any) that were entered for each account in Period 99.
Can anyone help? It may be that I am not seeing the forest for the trees, but I can't figure it out.
Thanks very much.
Sue
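One way to get the single-period figures without a second pass is conditional aggregation over the rows already collected: the running amount sums everything, while a CASE picks out only the rows whose gl7fiscalperiodsid equals the entered period. A sketch, assuming the populate query above lands in a hypothetical temp table named #transactions:

SELECT accountnumber,
       SUM(transamount) AS amount_through_period,      -- periods 97..@FiscalPeriod_identifier
       SUM(CASE WHEN gl7fiscalperiodsid = @FiscalPeriod_identifier
                THEN transamount ELSE 0 END) AS amount_in_period   -- e.g. period 99 only
FROM #transactions
GROUP BY accountnumber;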
View 6 Replies
Nov 20, 2006
Hi, I've an application, let's call it simply "A", which creates two huge tables in a Microsoft SQL database. Let's call them "table1" and "table2". It saves really a lot of data into these tables. After application "A" has finished, another application is executed which deletes these two tables. Then application "A" is started again and it creates the two tables again, but the amount of data becomes bigger. It can only proceed if the tables were deleted completely before and the database is empty. This is the procedure which I repeat very often, but every time the amount of data becomes bigger (table1 and table2 become bigger). A couple of times it works fine, but at some point the data seems to become too big and application "A" fails, most likely because the data wasn't removed correctly/completely. This is my code for deleting the two tables; maybe there is something I have to change:

// Requires System.Data.SqlClient and System.Threading.
try
{
    // NOTE: the backslashes in the server names appear to have been lost when
    // this post was archived; "(local)dbname" was presumably "(local)\dbname".
    SqlConnectionStringBuilder builder = new SqlConnectionStringBuilder(
        @"Server=mycomputer\dbname;Integrated Security=SSPI;Initial Catalog=testing");
    builder["Server"] = @"(local)\dbname";
    builder["Connect Timeout"] = 10;
    builder["Trusted_Connection"] = true;
    builder["Initial Catalog"] = ((ComponentConfiguration)this.componentConfig).Persistency.DatabaseName;

    SqlConnection sqlconnection = new SqlConnection();
    sqlconnection.ConnectionString = builder.ConnectionString;
    sqlconnection.Open();

    SqlCommand cmd1 = new SqlCommand("DROP TABLE table1"); // TODO: delete all tables
    SqlCommand cmd2 = new SqlCommand("DROP TABLE table2"); // TODO: delete all tables
    cmd1.Connection = sqlconnection;
    cmd2.Connection = sqlconnection;
    cmd1.ExecuteNonQuery();
    Thread.Sleep(7000);
    cmd2.ExecuteNonQuery();
    Thread.Sleep(7000);
    sqlconnection.Close();
    Thread.Sleep(3000);
}
catch
{
    // The empty catch swallows every failure, so a DROP that fails
    // (locks, permissions, connection errors) goes completely unnoticed.
}

Thanks for help! mulata
View 6 Replies
Jul 2, 2007
Generally, on any screen, we design the filter area by allowing the user to specify a range or a single value to search on.
But sometimes, on some screens, it is more convenient for the user to be able to specify any number of values to search on.
For example:
take a screen that shows information about people in each province.
The user would like to search by selecting however many provinces apply.
There are checkboxes for the provinces in the filter area, so users can pick them freely.
Hence, sometimes a user checks 3 provinces, LA, Michigan, Washington DC, to see descriptions for only those 3 chosen provinces,
and sometimes a user checks 2 provinces, LA, Michigan, to see descriptions for only the 2 chosen provinces.
Please give me any idea for creating a stored procedure, or any technique, to implement this.
Help me Pleaseeeee
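A common pattern for this kind of "any number of values" filter is to pass the checked provinces as one delimited string and split it inside the procedure. All names below are hypothetical; STRING_SPLIT requires SQL 2016+, so on the SQL 2005-era versions this thread targets you would substitute a hand-rolled split function or an XML-based split:

CREATE PROCEDURE dbo.SearchPeopleByProvince
    @Provinces varchar(1000)    -- e.g. 'LA,Michigan,Washington DC'
AS
BEGIN
    SELECT p.PersonName, p.Province, p.Description
    FROM dbo.People AS p
    WHERE p.Province IN (SELECT LTRIM(value)
                         FROM STRING_SPLIT(@Provinces, ','));
END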
View 1 Replies
Jan 12, 2006
Hello,
I have just created a test database and now need to insert a large number of records into one of the tables; we were thinking of about 1 million records. Has anyone got a SQL script that I could use to create these records?
cheers
john
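A set-based generator is far faster than a million single-row inserts. A sketch for SQL 2005 or later, assuming a hypothetical dbo.TestData table with an int column and a varchar column; cross-joining a system view against itself supplies more than enough rows to number:

INSERT INTO dbo.TestData (Id, Payload)
SELECT TOP (1000000)
       ROW_NUMBER() OVER (ORDER BY (SELECT NULL)),
       'test row ' + CONVERT(varchar(10), ROW_NUMBER() OVER (ORDER BY (SELECT NULL)))
FROM sys.all_columns AS a
CROSS JOIN sys.all_columns AS b;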
View 6 Replies
Aug 6, 2007
I need to make a job that will update up to 8000 rows, changing the list description 'BerkHold' to 'BerkNew', in SQL 2000. This is something that I have to do manually with several projects every day, using the following 2 steps.
SELECT ListDescription, CRRecordID
FROM dbo.BerkleyGroupInventory
WHERE ListDescription = 'BerkHold' AND CRCallDateTime < '1/1/2003' AND CRCallResultCode = 'CC'
ORDER BY CRRecordID
I then scroll to the 8000th row, copy the CRRecordID, and run the following query:
UPDATE dbo.BerkleyGroupInventory
SET ListDescription = 'BerkNew'
WHERE ListDescription = 'BerkHold' AND CRRecordID <= 5968432 AND CRCallDateTime < '1/1/2003' AND CRCallResultCode = 'CC'
I'm sure there's an easier way to do this, but I'm very new to SQL and haven't figured it out yet.
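On SQL 2000 the scroll-and-copy step can be dropped by capping the UPDATE itself with SET ROWCOUNT (on SQL 2005+ the preferred form is UPDATE TOP (8000)). A sketch reusing the same table and filter; note that without the CRRecordID cutoff there is no guarantee which 8000 qualifying rows get updated:

SET ROWCOUNT 8000;   -- cap the next statement at 8000 rows

UPDATE dbo.BerkleyGroupInventory
SET ListDescription = 'BerkNew'
WHERE ListDescription = 'BerkHold'
  AND CRCallDateTime < '1/1/2003'
  AND CRCallResultCode = 'CC';

SET ROWCOUNT 0;      -- always reset, or later statements stay capped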
View 11 Replies
Oct 7, 2005
This is a general question on the best way to import a large amount of data to a MS-SQL DB.
I can have the data in just about any format I need to; I just don't know how to import it. I have some experience with SQL, but not much.
There are about 1500 to 2000 lines of data. I am looking for the best way to get this amount of data in on a monthly basis.
Any help is greatly appreciated!!
Mike Charney
View 5 Replies
May 18, 2001
I need to delete data from a particular table which has more than half a million records. More than 200,000 records need to be deleted from the table. What is the best way to delete the data from the table, other than importing into a temporary table and performing the same operation?
Let me know if the strategy to be followed is okay:
1. Drop all the triggers.
2. Drop all the indexes.
3. Write a procedure with a loop, setting ROWCOUNT to 1000, and delete the records (since if I try to delete all the rows at once it gives a timeout error). The procedure will delete 1000 records per batch inside the loop until it wipes out all the data for the specified condition.
4. Recreate the indexes and triggers.
Please let me know if there is any other optimal solution.
Thanx,
Zombie
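For reference, the batched delete in step 3 can look like the sketch below (the table and condition are placeholders). Dropping the indexes and triggers around it, as outlined, is optional; keeping an index on the filter column usually helps the delete find its rows:

SET ROWCOUNT 1000;           -- SQL 7/2000-style batch cap

DECLARE @deleted int;
SET @deleted = 1;
WHILE @deleted > 0
BEGIN
    DELETE FROM dbo.BigTable
    WHERE SomeDate < '19990101';   -- the specified condition
    SET @deleted = @@ROWCOUNT;     -- 0 once nothing is left to delete
END

SET ROWCOUNT 0;              -- reset the session cap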
View 2 Replies
Jul 23, 2005
Hi, I am using SQL 2000 and have a table that contains more than 2 million rows of data (and growing). Right now, I have encountered 2 problems:
1) Sometimes, when I try to query against this table, I get a SQL command timeout. Hence, I did more testing with Query Analyzer, only to find that the same queries would not always take about the same time to execute. Could anyone please tell me what affects the speed of a query, and which is the most important factor of them all? (I can think of the open connections, the server's CPU/memory...)
2) I am not sure whether 2 million rows is considered a lot or not; however, it is starting to take 5-10 seconds for me to finish some simple queries. I am wondering what the best practices are for handling this amount of data while keeping decent performance.
Thank you,
Charlie Chang
[Charlies224@hotmail.com]
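When the same query swings between fast and slow, the usual suspects are cold versus warm cache, blocking, and plan differences; SET STATISTICS TIME/IO make the first of these visible. A minimal sketch (the query itself is a placeholder):

SET STATISTICS TIME ON;   -- CPU and elapsed time per statement
SET STATISTICS IO ON;     -- logical vs. physical reads; physical reads mean a cold cache

SELECT COUNT(*)
FROM dbo.BigTable
WHERE SomeColumn = 42;    -- placeholder for the slow query

SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;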
View 5 Replies
Jul 23, 2005
I have SQL 2000 and need to retrieve a fairly large amount of data (~50,000 characters) in XML format and then insert it into a field of the text type.
As FOR XML can't be used with local variables, INSERT INTO, or SELECT INTO, this makes the "XML support" quite useless in many respects.
Can anyone please help me in solving this?
Thanks a lot for your help and time.
Pavel
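For what it's worth, the restriction is specific to SQL 2000: from SQL 2005 onwards FOR XML can appear as a scalar subquery, so the result assigns to a variable or inserts directly. A sketch with placeholder names:

DECLARE @x nvarchar(max);

SELECT @x = (SELECT OrderID, CustomerID
             FROM dbo.Orders
             FOR XML AUTO);           -- subquery form, SQL 2005+

INSERT INTO dbo.XmlArchive (Payload)  -- a text/nvarchar(max) column
VALUES (@x);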
View 1 Replies
Sep 30, 2005
I need to retrieve a large amount of data from the SQL Server database, make changes to one field, and put the data back using a console application. How do I do it?
View 3 Replies
May 5, 2003
Hi,
My application needs to retrieve data from a table which has more than 15 lakh (1.5 million) records. The records keep increasing by thousands every 15 days.
Is there any way I can reduce the retrieval time? Basically I have a SELECT statement with a few conditions and a clause for the IDs of these records.
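The usual first step is an index matching the WHERE clause, so the 1.5 million-row scan becomes a seek. A sketch with hypothetical names shaped like the description (a condition column plus an ID list):

-- Key the index to the filtered columns.
CREATE NONCLUSTERED INDEX IX_Records_Status_Id
    ON dbo.Records (Status, RecordId);

-- The SELECT can then seek instead of scanning the whole table.
SELECT RecordId, Status, Amount
FROM dbo.Records
WHERE Status = 'A'
  AND RecordId IN (1001, 1002, 1003);   -- the clause for the IDs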
View 2 Replies
Jun 9, 2006
Dear,
I created a package that gets data from files and database sources, does some transformations, retrieves dimension IDs, and then inserts the result into a fact table.
Running this package with a limited amount of data (a few hundred thousand records) does not result in any errors, and everything goes fine.
Now, running the same package (still in debug mode) with more data (about 2,000,000 rows) doesn't produce any errors either, but it just stops running. In fact, it doesn't really stop, but it doesn't continue either. If I had only been waiting for some minutes or hours, I could believe it was still processing, but I waited for about a day and it is still 'processing' the same step.
Any ideas on how to dig further into this in order to find the problem? Or is this a known problem?
Thanks for your ideas,
Jievie
View 4 Replies
Jan 30, 2007
Hi Experts:
We have several databases linked via merge replication. Due to business requirements, we needed to delete 5M rows in one table, and we did it on one subscriber. However, the publisher keeps uploading the deletion operations from the subscriber, which blocks any downloading operations from publisher to subscriber. How can we accelerate the replication now? It has already been running for 2 days and will continue for another 1-2. Is it possible to make the publisher perform the download before the upload? How can large data deletion operations be sped up in a replication environment?
Thanks in advance!
Ron
View 4 Replies
Sep 1, 2006
I have an interesting challenge. We are not allowed to give users direct access to data in our SQL Server. Audit requires us to take the data out of our production server and pass it to the user. My situation is: I have a table in SQL with over 100,000 records, and growing. I want to pass that to an Access database. I am utilizing DTS and a data transform.
Is there a better way? The package is running slowly and even appears to freeze, and I expect this amount of data to keep growing as well.
Don S
View 1 Replies
Jul 20, 2007
Recently we added a new table into our SQL2000 database specifically to store scanned in images of documents. This new table contains a PK field, a couple of datetime fields, a couple of char(1) fields and one 'image' field.
Before adding this table, the database size was approx 6GB. Six months after adding this new table, the database has grown to 18GB - 11GB of this is due to the scanned in images.
Would this new table affect SQL performance with regard to accessing other data in the database that has nothing to do with the new table?
If so, would moving this new table into its own database be recommended?
Thanks
Rod
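Before splitting the table into its own database (which complicates backups and referential integrity), a middle option is a dedicated filegroup on separate disks. One caveat makes this a create-copy-swap job rather than an index rebuild: image pages live wherever TEXTIMAGE_ON pointed at CREATE TABLE time. A sketch with hypothetical names and paths:

ALTER DATABASE MyDb ADD FILEGROUP IMAGES;
ALTER DATABASE MyDb ADD FILE
    (NAME = MyDb_Images, FILENAME = 'E:\Data\MyDb_Images.ndf')
    TO FILEGROUP IMAGES;

-- Recreate the table with its LOB data on the new filegroup, then swap.
CREATE TABLE dbo.ScannedDocs_New
(
    DocId     int      NOT NULL PRIMARY KEY,
    ScannedOn datetime NOT NULL,
    Flag      char(1)  NULL,
    DocImage  image    NULL
) ON IMAGES TEXTIMAGE_ON IMAGES;

INSERT INTO dbo.ScannedDocs_New SELECT * FROM dbo.ScannedDocs;
EXEC sp_rename 'dbo.ScannedDocs', 'ScannedDocs_Old';
EXEC sp_rename 'dbo.ScannedDocs_New', 'ScannedDocs';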
View 1 Replies
Aug 6, 2014
I am trying to add a field [bank_chk_amt] from a table to my query to bring in a dollar amount. I originally had this below:
SELECT dbo.CBPAYMENT.REFERENCE_NO, 'CCMCCHBREF' AS Literal, RTrim([description]) + ', ' + [bank_chk_amt] + ', ' + convert(char(10),[check_date],101) + ', ' + 'Refund check sent to ' + [payee_name] AS [Free Text]
FROM dbo.CBPAYMENT;
but I would get "Error converting data type varchar to numeric". So my co-worker modified it and added str([bank_chk_amt]) to my query, which worked, but I noticed it dropped the cents. So instead of 80.35 it would show 80, and I noticed it rounded 100.50 to 101. How can I bring in the full dollar amount without the rounding?
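STR() defaults to zero decimal places, which explains both the dropped cents and the rounding; converting through a two-decimal numeric type keeps them. A sketch of the same concatenation, assuming bank_chk_amt is a money/numeric column:

-- STR(x) means STR(x, 10, 0) by default, hence 80.35 -> 80 and 100.50 -> 101.
SELECT dbo.CBPAYMENT.REFERENCE_NO, 'CCMCCHBREF' AS Literal,
       RTRIM([description]) + ', '
     + CONVERT(varchar(20), CONVERT(decimal(12, 2), [bank_chk_amt])) + ', '
     + CONVERT(char(10), [check_date], 101) + ', '
     + 'Refund check sent to ' + [payee_name] AS [Free Text]
FROM dbo.CBPAYMENT;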
View 2 Replies
Nov 5, 2006
Hy everyone,
I have a problem pulling data to a PDA using SQL Mobile 5 from SQL Server 2000.
It works for most tables, except for two. Each of those two tables has about 1000 records, and every record is about 150 bytes, so all the data is about 150 KB.
When I connect to SQL Server 2000 through the LAN, there is no problem. But when I connect through GPRS - VPN, the error occurs. The PDA-phone connection is via Bluetooth, and when pinging no timeout appears; the replies are about 600-700 ms. When connecting with a desktop computer through GPRS, no such problem occurs.
So I don't know what the cause could be: the connection, or SQL Mobile not retrying enough on error to transfer all the data.
A specialist from the GPRS provider told me to use a 3G phone, but because the GPRS phones have already been bought, this would be too big an investment. So I don't think this is a good idea at the moment.
I found a similar problem on the net and a solution was to use Pooling = False. But in SQL Mobile there is no pooling parameter available when building the connection string.
Any help is very precious to me ,
Mihai .
View 10 Replies
Aug 10, 2006
Hi all,
I have a table named Prescription that consists of attributes like PatientId, MedicineCode, MedicineName, the price of each drug, the quantity of each drug (e.g. 1, 2, 3, 10), and a date.
I would like to get a summary of the total number and amount of the different drugs dispensed in a specific period, and the total amount for each type of drug.
I kindly request for help.
Thanx in advance.
Ronnie
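A grouped aggregate over the date range covers both totals at once. This sketch assumes, beyond the columns named above, a Price and a Quantity column per prescription row:

SELECT MedicineCode,
       MedicineName,
       SUM(Quantity)         AS total_units,    -- total number of each drug
       SUM(Quantity * Price) AS total_amount    -- total spend per drug
FROM dbo.Prescription
WHERE [date] >= '20060101'
  AND [date] <  '20060701'                      -- the specific period
GROUP BY MedicineCode, MedicineName
ORDER BY MedicineName;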
View 4 Replies
Oct 22, 2006
Is it possible to move the log file of a database to a different directory without corrupting the database? If yes, how?
Thanks,
Keezeg
:beer:
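Yes. The classic route on SQL 2000 is sp_detach_db, move the .ldf, then sp_attach_db with the new path; on SQL 2005+ the database can stay attached. A sketch with placeholder names and paths:

-- SQL 2005+ style: repoint the log file, bounce the database, move the file.
ALTER DATABASE MyDb
    MODIFY FILE (NAME = MyDb_log, FILENAME = 'E:\NewLogs\MyDb_log.ldf');

ALTER DATABASE MyDb SET OFFLINE;
-- ...physically move the .ldf to E:\NewLogs while the database is offline...
ALTER DATABASE MyDb SET ONLINE;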
View 1 Replies
May 19, 2004
I have to make a process/stored procedure that will either send a file to or receive a file from a specific location (parameters: FTP server, username, password, filename, etc.). Need direction/help on how to do this, or if anyone has done this before, please help.
thx
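One well-worn approach from pure T-SQL is to write a script for the Windows ftp.exe client and run it through xp_cmdshell (which must be enabled, and the SQL Server service account needs access to the paths and the network). Everything below, server name included, is a placeholder sketch:

-- Build an ftp.exe command script, then execute it.
EXEC master..xp_cmdshell 'echo open ftp.example.com> C:\temp\ftp.txt';
EXEC master..xp_cmdshell 'echo myuser>> C:\temp\ftp.txt';
EXEC master..xp_cmdshell 'echo mypassword>> C:\temp\ftp.txt';
EXEC master..xp_cmdshell 'echo put C:\temp\outfile.csv>> C:\temp\ftp.txt';
EXEC master..xp_cmdshell 'echo bye>> C:\temp\ftp.txt';
EXEC master..xp_cmdshell 'ftp -s:C:\temp\ftp.txt';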
View 3 Replies
Aug 4, 2015
Background information: SQL 2014 Enterprise, highly volatile OLTP environment. We generate 10-12 GB of compressed transaction log backup files every 15 minutes.
Currently we have a two-node A/P cluster residing on a flash array. We need to leverage AlwaysOn to offload processing. The replica server will have flash storage, and the replica node has the same CPU and memory footprint, with a 10 GB connection between nodes. Is anyone else generating such large transaction logs per 15/30-minute period?
View 0 Replies