Large Amount Of Pages For Few Rows

Jul 23, 2005

Hello,

I have noticed that some of my tables occupy an extremely large number
of pages while holding only a few rows. An example is a table with 37 rows over 22,000
pages! The columns are simple integer and char types. I fixed the problem by
introducing a clustered index; now it only uses 1 page. But can anyone
explain this behaviour in SQL Server 2000?
regards Jakob Mathiasen
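
A minimal sketch of the fix described above, assuming a hypothetical heap table dbo.BigHeap with an integer column SomeInt. In SQL Server 2000 a heap may keep its allocated pages after large deletes, so building a clustered index, which rebuilds and compacts the table, is a common remedy:

EXEC sp_spaceused 'dbo.BigHeap';   -- compare reserved pages with the row count

CREATE CLUSTERED INDEX CIX_BigHeap ON dbo.BigHeap (SomeInt);   -- rebuild releases the empty pages

-- optionally drop it again if a heap is really wanted (SQL 2000 DROP INDEX syntax)
DROP INDEX dbo.BigHeap.CIX_BigHeap;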

View 4 Replies


Report Printing With Blank Pages When Large Amount Of Text In Column Values. Urgent

Apr 16, 2008

Hi every one,
I am facing a problem printing reports from the browser, and also when I export to PDF: blank pages appear when a report column contains a large amount of text (around 2,500 characters) in a column value.
Can anyone help me with this issue? If the report gets an acceptable amount of data, it prints properly, i.e. no blank pages at all. I have set all the properties so that margins + body size < page size.

View 4 Replies View Related

Large Amount Of Text?

Feb 22, 1999

We have created a database in SQL Server 7 for support issues.
We are having trouble with only one aspect of the database, and that is the body of the supported problem.

The database is basically a Question/Answer support database which houses Frequently Asked Questions.
When the user does a search on a specific problem, a list of matching questions is shown on the screen (links).
When a question is clicked, it shows the question with the answer of possible fixes.

Our problem lies in the fact that SQL Server is truncating the answer portion. I have tried different data types with maximum lengths with no success; it keeps truncating.
Right now I am on data type nvarchar with a length of 4000.

If anyone has any pointers on how to solve this problem, all input is appreciated. You can post here to the forum or e-mail me directly at jason@flnet.com

Thank You.

-JR
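
A minimal sketch of one workaround, assuming a hypothetical dbo.FAQ table: in SQL Server 7, (n)varchar tops out at 8000/4000 characters, so longer answers need the text or ntext type. Note also that Query Analyzer truncates displayed results at a configurable character limit, which can look like truncation even when the stored value is complete.

-- add an ntext column and copy the existing answers into it
ALTER TABLE dbo.FAQ ADD AnswerFull ntext NULL;

UPDATE dbo.FAQ SET AnswerFull = Answer;   -- implicit nvarchar -> ntext conversion

-- once verified, the old nvarchar(4000) column can be dropped
ALTER TABLE dbo.FAQ DROP COLUMN Answer;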

View 2 Replies View Related

Inserting Large Amount Of Data

Jan 12, 2006

hello

I have just created a test database and now need to insert a large number of records into one of the tables; we were thinking of about 1 million records. Has anyone got an SQL script that I could use to create these records?

cheers
john
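
A minimal sketch, assuming a hypothetical table dbo.TestData (Id int, Val varchar(10)); a plain WHILE loop is the simplest way to generate a million rows, though committing in batches or using a set-based cross join against a numbers table will run considerably faster:

SET NOCOUNT ON;
DECLARE @i int;
SET @i = 1;
WHILE @i <= 1000000
BEGIN
    INSERT INTO dbo.TestData (Id, Val)
    VALUES (@i, 'row' + CAST(@i AS varchar(7)));
    SET @i = @i + 1;
END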

View 6 Replies View Related

SQL Import Of Large Amount Of Data

Oct 7, 2005

This is a general question on the best way to import a large amount of data to an MS-SQL DB. I can have the data in just about any format I need to, I just don't know how to import the data. I have some experience with SQL but not much.

There are about 1,500 to 2,000 lines of data. I am looking for the best way to get this amount of data in on a monthly basis. Any help is greatly appreciated!!

Mike Charney
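
If the data can be produced as a delimited text file, BULK INSERT is probably the simplest route for a monthly load of this size. A minimal sketch, assuming a hypothetical comma-delimited file with a header row and a matching table dbo.MonthlyData:

BULK INSERT dbo.MonthlyData
FROM 'C:\imports\monthly.csv'
WITH (FIELDTERMINATOR = ',',
      ROWTERMINATOR = '\n',
      FIRSTROW = 2);   -- skip the header row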

View 5 Replies View Related

INDEXES – Inserting A Large Amount Of Records

Feb 25, 2007

Is there any way to create indexes only after the insertion of a certain
number of records? (I don't want to rebuild the index after every inserted record,
but for example after every 1,000 records.)

I heard it should be possible with "bulk insert" or with
transactions. Is that right? I need to do this with MS SQL Server 2005 (Workgroup
Edition).

Thank you for your ideas!

Jan
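
Indexes are not built incrementally per batch; the usual pattern is to drop them, bulk load with a batch size, and recreate them afterwards. A minimal sketch with hypothetical names, using SQL Server 2005 syntax:

DROP INDEX IX_Orders_Customer ON dbo.Orders;

BULK INSERT dbo.Orders
FROM 'C:\loads\orders.dat'
WITH (BATCHSIZE = 1000);   -- commit every 1,000 rows as its own transaction

CREATE INDEX IX_Orders_Customer ON dbo.Orders (CustomerID);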

View 5 Replies View Related

What Datatype To Use When Inserting Large Amount Of Text

Jun 27, 2005

What datatype should I use when inserting a large amount of text?

View 6 Replies View Related

Deleting Large Amount Of Data From A Table

May 18, 2001

I need to delete data from a particular table which has more than half a million records. More than 200,000 records need to be deleted from the table. What is the best way to delete the data from the table, other than importing it into a temporary table and performing the same operation?

Let me know if the strategy to be followed is okay.

1. Drop all the triggers.
2. Drop all the indexes.
3. Write a procedure with a loop, setting ROWCOUNT to 1000, and delete the records (since if I try to delete all the rows at once it gives a timeout error).
The procedure will delete 1000 records per batch inside the loop until it wipes out all the data for the specified condition.
4. Recreate the indexes and triggers.

Please let me know if there are any other optimal solutions.

Thanx,
Zombie
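
A minimal sketch of step 3, assuming a hypothetical table and filter condition; SET ROWCOUNT caps each DELETE at 1000 rows, and the loop exits when nothing is left to delete:

SET ROWCOUNT 1000;

WHILE 1 = 1
BEGIN
    DELETE FROM dbo.BigTable WHERE CreatedDate < '20000101';
    IF @@ROWCOUNT = 0 BREAK;
END

SET ROWCOUNT 0;   -- always reset, or later statements stay capped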

View 2 Replies View Related

Admin Performance Issues, Large Amount Of DBs

Mar 10, 2004

I have a large amount of DBs (150) on my SQL Server, and when using Enterprise Manager to do administrative tasks like backup, restore etc., it takes 1.5 hours just to open the Databases folder. The server is a 2 x P4 with 3 GB RAM. Any ideas on how to manage this number of DBs on the same server and instance of SQL?
Cheers!

View 2 Replies View Related

Website Making A Large Amount Of Commands

Jul 29, 2014

I have been in contact with my hosting provider, who have told me that my website seems to be issuing a large number of SQL commands and locking the tables.

View 3 Replies View Related

How Do You Improve SQL Performance Over Large Amount Of Data?

Jul 23, 2005

Hi,

I am using SQL 2000 and have a table that contains more than 2 million rows of data (and growing). Right now, I have encountered 2 problems:

1) Sometimes, when I try to query against this table, I get a SQL command timeout. Hence, I did more testing with Query Analyzer, only to find that the same queries would not always take about the same time to execute. Could anyone please tell me what would affect the speed of the query, and which is the most important factor among them all? (I could think of the open connections, the server's CPU/memory...)

2) I am not sure whether 2 million rows is considered a lot or not; however, it has started to take 5-10 seconds for me to finish some simple queries. I am wondering what the best practices are to handle this amount of data while keeping decent performance?

Thank you,
Charlie Chang
[Charlies224@hotmail.com]
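
Much of the variance usually comes down to whether the query can seek on an index or has to scan the 2 million rows, plus whatever else is competing for cache and CPU at the time. A minimal sketch, assuming a hypothetical query filtering on CustomerID and OrderDate:

CREATE INDEX IX_Orders_CustomerDate ON dbo.Orders (CustomerID, OrderDate);

SET STATISTICS IO ON;   -- logical reads are a more stable measure than elapsed time
SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE CustomerID = 42
  AND OrderDate >= '20050101';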

View 5 Replies View Related

How To Return Large Amount Of Data In The XML Format

Jul 23, 2005

I have SQL 2000 and need to retrieve a fairly large amount of data (~50,000 characters) in XML format and then insert it into a field of the text type.

As 'FOR XML' can't be used with local variables, INSERT INTO or SELECT INTO, this makes "XML support" quite useless in many respects.

Can anyone please help me in solving this? Thanks a lot for your help and time.

Pavel

View 1 Replies View Related

Passing Large Amount Of Records As Single Parameter?

Jan 18, 2008

Hello all - I currently have a project that has a gridview on the front end. The user can select multiple items from this grid, hit a button, and all of the selected records should be updated in the process. However, this resultset can have a large amount of data coming back, and I'm stumped on how to pass all of the IDs to the SP. I'd rather not call the SP for each record selected, as there could be 1,000 items selected, and well, I'd rather not call the SP 1,000 times :p. I thought of generating a comma-delimited list as I loop through the grid and using dynamic SQL, but the IDs are about 6-7 digits long and, including commas, would take up almost all of the max space in a varchar.
Are there any good solutions to this problem? Passing the items as an array? Generating a data table in .NET and passing that?
Any help would be appreciated.
-Jaime
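
One common approach on SQL Server 2005 is to pass the IDs as a single comma-delimited varchar(max) parameter (which removes the 8000-character ceiling) and split it into a table inside the procedure. A minimal sketch, with hypothetical names throughout:

CREATE FUNCTION dbo.SplitIds (@list varchar(max))
RETURNS @ids TABLE (Id int)
AS
BEGIN
    DECLARE @pos int;
    WHILE LEN(@list) > 0
    BEGIN
        SET @pos = CHARINDEX(',', @list);
        IF @pos = 0
        BEGIN
            INSERT @ids VALUES (CAST(@list AS int));
            SET @list = '';
        END
        ELSE
        BEGIN
            INSERT @ids VALUES (CAST(LEFT(@list, @pos - 1) AS int));
            SET @list = SUBSTRING(@list, @pos + 1, LEN(@list));
        END
    END
    RETURN;
END

-- inside the stored procedure: one set-based update instead of 1,000 calls
UPDATE t
SET    t.Processed = 1
FROM   dbo.Items t
JOIN   dbo.SplitIds(@idList) s ON s.Id = t.ItemId;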

View 4 Replies View Related

Console Application For Retrieving A Large Amount Of Data

Sep 30, 2005

I need to retrieve a large amount of data from the SQL Server database, make changes to one field, and put the data back using a console application. How do I do it?

View 3 Replies View Related

Reducing Time On Retrieving Large Amount Of Data

May 5, 2003

Hi,
My application needs to retrieve data from a table which has more than 15 lakh (1.5 million) records. The records keep increasing by thousands every 15 days.
Is there any way I can reduce the retrieval time? Basically I have a SELECT statement with a few conditions and a clause for the IDs of these records.

View 2 Replies View Related

How To Make Recursive Algorithm Without Using Large Amount Of Resources

May 4, 2015

How can I make a recursive algorithm in TSQL without using a large amount of resources?

New hires are put on a 6-month probation period. Their probation ends at the 6-month mark plus the sick time, isnull(floor(sick_hours/8),0), they accrued during those 6 months. Their accrued sick days are added to the 6-month probation period to establish a new probation end date.

I then have to do the following loop:

1)Determine if any of the days between the original probation end date and the new probation end date have any weekend days or holidays for a particular employee.

2)If it does, I add those days to the new probation end date to create a newer probation end date.
a.I then go back up to step 1).

3)If it does not, I jump out of loop and look at the next employee on probation.

Note: I already have code for determining the number of sick days accumulated during an employee's probation period. Recursive CTE? Recursive function? Other?
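
A recursive CTE is awkward here (the recursive member can't contain the aggregate needed to count non-working days), so one alternative that stays set-based is a WHILE loop that extends every employee at once and stops when no row changes. A minimal sketch, assuming hypothetical tables dbo.Probation (EmpID, CheckedThrough, ProbEnd), where CheckedThrough starts at the original 6-month end date and ProbEnd at that date plus accrued sick days, and dbo.Calendar (CalDate, IsWorkDay) with weekends and holidays flagged as IsWorkDay = 0:

DECLARE @rows int;
SET @rows = 1;

WHILE @rows > 0
BEGIN
    -- push ProbEnd out by the non-working days in the not-yet-checked span,
    -- and mark that span as checked (assignments read pre-update values)
    UPDATE p
    SET ProbEnd        = DATEADD(DAY, x.NonWork, p.ProbEnd),
        CheckedThrough = p.ProbEnd
    FROM dbo.Probation p
    CROSS APPLY (SELECT NonWork = COUNT(*)
                 FROM dbo.Calendar c
                 WHERE c.CalDate > p.CheckedThrough
                   AND c.CalDate <= p.ProbEnd
                   AND c.IsWorkDay = 0) x
    WHERE x.NonWork > 0;

    SET @rows = @@ROWCOUNT;   -- done once no employee needed a further extension
END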

View 0 Replies View Related

Large Amount Of Data Deletion Blocking Other Operations

Jan 30, 2007

Hi Experts:

We have several databases linked via merge replication. Due to business requirements, we needed to delete 5M rows in one table, and we did it on one subscriber. However, the publisher kept uploading the deletion operations from the subscriber, which blocked any downloading operations from publisher to subscriber. How can we accelerate the replication now, as this has already been running for 2 days and will continue for another 1-2 days? Is it possible to have the publisher do the downloading before the uploading? How do you speed up large data deletion operations in a replication environment?



Thanks in advance!

Ron

View 4 Replies View Related

Transferring Large Amount Of Data From SQL To Microsoft Access

Sep 1, 2006

I have an interesting challenge. We are not allowed to give users direct access to data in our SQL Server. Audit requires us to take the data out of our production server and pass it to the user. My situation is I have a table in SQL with over 100,000 records and growing. I want to pass that to an Access database. I am utilizing DTS and a data transform.

Is there a better way? The package is running slowly and even appears to freeze. I also see this amount of data growing over time.





Don S

View 1 Replies View Related

Does A Large Amount Of 'image' Data Affect Overall SQL Performance?

Jul 20, 2007

Recently we added a new table into our SQL2000 database specifically to store scanned in images of documents. This new table contains a PK field, a couple of datetime fields, a couple of char(1) fields and one 'image' field.



Before adding this table, the database size was approx 6GB. Six months after adding this new table, the database has grown to 18GB - 11GB of this is due to the scanned in images.



Would this new table affect the SQL performance with regards to accessing other data in the database that has nothing related to the new table?



If so, would moving this new table into its own database be recommended?



Thanks

Rod
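
Short of a separate database, one option in SQL 2000 is to keep the image data on its own filegroup so its pages don't interleave with the rest of the database. A minimal sketch with hypothetical names:

ALTER DATABASE MyDb ADD FILEGROUP IMAGES;
ALTER DATABASE MyDb ADD FILE
    (NAME = 'MyDb_Images', FILENAME = 'D:\data\MyDb_Images.ndf')
    TO FILEGROUP IMAGES;

CREATE TABLE dbo.ScannedDocs (
    DocId     int      NOT NULL PRIMARY KEY,
    ScannedOn datetime NOT NULL,
    Flag      char(1)  NULL,
    DocImage  image    NULL
) ON [PRIMARY] TEXTIMAGE_ON IMAGES;   -- image pages live on the IMAGES filegroup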

View 1 Replies View Related

Complexity Problem With Large Amount Of Records In A Link Table

Feb 1, 2008

A friend reminded me of a problem we tried to solve a few years ago and were unsuccessful.  Below is a copy of the email he sent me.  We would very much appreciate any ideas from the community.  Thanks!

Let's start with a simple schema where you have 4 tables:

View 3 Replies View Related

Large Amount Of Text Pushing To Separate Page After PDF Render

May 25, 2006

I am having a small problem.

When I generate a report with a large amount of text data (Paragraphs) the report will not split the text between two pages.

It will move the entire text box to the next page leaving a large amount of space on the previous page.

I tried every control on the page to render this: textboxes, tables, lists, rectangles, subreports, etc.

The data is stored in a SQL table using a text data type (there can be a large amount of data entered into this field).

Any help on this would be great. I like reporting services, but there are just a few bugs.

I am using Reporting Services 2000 with Service Pack 2 and the hotfix at

http://support.microsoft.com/?kbid=912424 (that fixed a different PDF rendering issue), and this problem was occurring prior to this hotfix.

Thanks in advance

View 5 Replies View Related

Populating An Access Combo Box With Large Amount Of Data Causes Table Lock In SQL Server

Jul 20, 2005

I have a combo box where users select the customer name and can either go to the customer's info or open a list of the customer's orders.

The RowSource for the combo box was a simple pass-through query:

SELECT DISTINCT [Customer ID], [Company Name], [contact name], City, Region FROM Customers ORDER BY Customers.[Company Name];

This was working fine until a couple of weeks ago. Now whenever someone has the form open, this statement locks the entire Customers table.

I thought a pass-through query was read-only, so how does this do a table lock?

I changed the code to an unbound rowsource that asks for input of the first few characters first, then uses this SQL statement as the rowsource:

SELECT [Customer ID], [Company Name], [contact name], City, Region From dbo_Customers WHERE [Company Name] like '" & txtInput & "*' ORDER BY [Company Name];

This helps, but if someone types only one letter, it could still be pulling a few thousand records and cause a table lock.

What is the best way to populate a large combo box? I have too much data for the ADODB recordset to use the .AddItem method.

I was trying to figure out how to use an ADODB connection, so that I can make it read-only to eliminate the locking, but I'm striking out on my own. Any ideas would be appreciated.

Roy
(Using Access 2003 MDB with SQL Server 2000 back end)
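
If dirty reads are acceptable for a pick list, a NOLOCK hint on the pass-through query stops it from taking or honoring shared locks. A minimal sketch of the same RowSource with the hint added; whether uncommitted data is tolerable here is a judgment call:

SELECT DISTINCT [Customer ID], [Company Name], [contact name], City, Region
FROM Customers WITH (NOLOCK)
ORDER BY [Company Name];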

View 2 Replies View Related

Need Help For SSIS Package Creation With INSERT,UPDATE Large Amount Of Records Through Business Intelligence Studio

Jun 1, 2006

Hi ,

How do I insert and update a large number of records (4 lakh, i.e. 400,000) into a destination table through Business Intelligence Studio using an SSIS package? How can I achieve this? I tried a left outer join with a conditional split, but the problem is that it's not able to insert and update records simultaneously. Can anyone give me a sample?
Thanks & Regards
Jeyakumar.M
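
One workaround is to land the incoming rows in a staging table from the data flow and then run the insert and update as two set-based statements in an Execute SQL Task. A minimal T-SQL sketch, with hypothetical table and column names:

-- update rows that already exist in the destination
UPDATE d
SET    d.Amount = s.Amount
FROM   dbo.Destination d
JOIN   dbo.Staging s ON s.BusinessKey = d.BusinessKey;

-- insert rows that don't exist yet
INSERT INTO dbo.Destination (BusinessKey, Amount)
SELECT s.BusinessKey, s.Amount
FROM   dbo.Staging s
LEFT JOIN dbo.Destination d ON d.BusinessKey = s.BusinessKey
WHERE  d.BusinessKey IS NULL;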

View 3 Replies View Related

How To Select Max Amount Rows

Jul 20, 2005

Please help me select these rows from these tables. My tables are:

table1:
table1Id  groupId  table2id  price
1         1        1         10
2         1        3         1000
3         1        4         500
4         2        1         5
5         2        3         1000
6         2        2         2000

table2:
table2id  name
1         hello
2         test
3         test1
4         test2

OK, I want to select the maximum priced row from table1, grouped by groupid, with table2id and table2.name. My output should be:

groupid  price  table2id  table2.name
1        1000   3         test1
2        2000   2         test

If I do:

select groupid, max(table1.price) as price from table1 group by groupid

this gives me the max priced row from table1, but when I join it with table2 it gives me all rows, like:

Select groupid, max(price) as price, table2id, table2.name from table1 inner join table2 group by groupid, table2id, table2.name

It gives me all rows because I had to group by table2id and table2.name, but I can't take them out because then I get a non-aggregated value error. I can't figure out any other way, please help.

SQL Server 2000

Thanks
Eric
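
One way to do this on SQL Server 2000 is to compute the per-group maximum in a derived table and join it back to pick out the matching rows. A sketch using the tables above; note that a group with two rows tied at the maximum price would return both:

SELECT t1.groupId, t1.price, t1.table2id, t2.name
FROM table1 t1
JOIN (SELECT groupId, MAX(price) AS maxprice
      FROM table1
      GROUP BY groupId) m
  ON  m.groupId  = t1.groupId
  AND m.maxprice = t1.price
JOIN table2 t2
  ON  t2.table2id = t1.table2id;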

View 2 Replies View Related

Rows In Pages

Jun 27, 2006

Hi everyone,
I have a simple question for you about rows in pages.
As I understand it, rows are where the data is stored in databases.
I would like to ask whether there are any important properties of rows which I am not aware of.

Thanks

View 3 Replies View Related

Returning Amount Of Rows From Sqldatasource

Oct 10, 2007

TotalSelected.Value = SqlDataSource1.SelectCommand = "SELECT COUNT(*) FROM tblNews";
 
The reason I am trying to do this is that if I can find out the number of rows before the SqlDataSource selects for the DetailsView, then I can make the SqlDataSource select depend on the total minus 5. E.g. if the total is 200, I can select the bottom 195 so it misses the top 5 for the DetailsView. Anyone have any ideas?
 Thanks Andy,
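
Rather than counting first, the "skip the newest 5" part can be done in a single SELECT. A minimal sketch, assuming a hypothetical NewsId identity column that reflects insertion order:

SELECT *
FROM tblNews
WHERE NewsId NOT IN (SELECT TOP 5 NewsId FROM tblNews ORDER BY NewsId DESC)
ORDER BY NewsId DESC;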

View 5 Replies View Related

Merge Subscribers Pull Twice The Amount Of Rows

May 26, 2007

We are running merge replication, SQL Server 2005 Enterprise with SQL Mobile 2005 (Windows Mobile 5) subscribers. Partitions are filtered on HOST_ID.



Occasionally we experience a situation where a subscriber has an unusually long synchronization duration, and upon examining Replication Monitor, it appears that twice the number of rows, or more (up to 7 times the number of rows) that should have been inserted are recorded as synchronized for the session: once the normal amount as inserts and once the normal amount as updates. This occurs for all tables in the subscription, and it occurs on a first-time synchronization to an empty subscriber database, where only inserts should be taking place.



I have examined the snapshot partition folders for these users on the file system and they appear to be identical in size and content to those of other subscribers. Checking the last partition snapshot job run and other characteristics for the subscriber in question, everything appears to be the same as for other subscribers functioning normally.



The HOST_ID for us is an employee ID used to filter employee-specific data. I have seen this happen when the subscriber changes the value of the HOST_ID used in filtering after the mobile database has already been populated (2 employees attempting to use the same mobile device and database). But we have seen this happen recently where the HOST_ID was apparently never changed midstream.



This just started happening recently. The only modification around the same time frame was the implementation of a custom business logic handler/custom conflict resolver that behaves like "Latest Wins" but has logic added to update a last-update datetime column for selected transaction tables at synchronization time, so that an SSIS job can detect the changed rows for copying incremental database changes to another application database. This all seems to be working perfectly.



Any ideas?



Thanks,

Matt

View 1 Replies View Related

Question About Move Large Amount Of Data From Database To Database

Apr 23, 2007

Guys,

I have a project that needs to move more than 100,000 records from one database table to another database table every week. Currently, users input a date range from the web UI, and my stored procedure takes those date ranges, INSERTs the records into a table in another database, then deletes the records. But it takes a really long time to finish this action (up to 1 or 2 hours).

My question is whether there is some other way I should do this to speed up the action. I am thinking about using bcp to copy those records to a data file and then using bcp to insert them into the SQL Server table. Is this the right way to do it, or should I consider another solution (and if so, what is the solution)?

Thanks a lot!
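
A minimal sketch of the bcp route mentioned above, with hypothetical server, database, and path names; -n uses native format, -T a trusted connection, and -b commits in 10,000-row batches so the load isn't one huge transaction:

bcp SourceDb.dbo.WeeklyMove out C:\transfer\weekly.bcp -n -S SRCSERVER -T
bcp TargetDb.dbo.WeeklyMove in  C:\transfer\weekly.bcp -n -S TGTSERVER -T -b 10000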

View 8 Replies View Related

Show Total Amount Of Rows In Specific Table

May 20, 2012

I need to show the total number of rows in specific tables.

The query is as follows:

As part of the planning process to expand the database that supports Northwind operations, the IT manager would like to know how many rows are currently in specific tables so that he can conduct capacity planning.

The results needed include two columns: TableName (containing all the tables in the database) and Rows (containing the total number of rows per table).
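
A minimal sketch using the catalog views; the counts in sys.partitions are maintained by the engine and avoid scanning each table, and while they can be marginally stale, they are fine for capacity planning:

SELECT o.name AS TableName,
       SUM(p.rows) AS Rows
FROM sys.objects o
JOIN sys.partitions p ON p.object_id = o.object_id
WHERE o.type = 'U'
  AND p.index_id IN (0, 1)   -- heap or clustered index only, to avoid double counting
GROUP BY o.name
ORDER BY o.name;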

View 4 Replies View Related

SQL Rows In Pages Exceeding 8060 Bytes

Jan 14, 2008

This is a question that I have not had an opportunity to test; I was wondering whether anyone in the SQL world knows the answer. In SQL 2K and 2005, your rows are limited to 8060 bytes without using varchar(MAX). My question is: do you have to specify varchar(max) before your row can exceed 8060 bytes, or does SQL 2005 exceed it without specifying varchar(max)? Also, does SQL 2005 expand the row across multiple pages automatically? Please assist if you can.
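
A minimal sketch of what changed in SQL 2005: no varchar(max) is required, because rows whose combined variable-length data exceeds 8,060 bytes are pushed to row-overflow pages automatically. On SQL 2000 the same table creates with a warning, but the oversized INSERT fails:

CREATE TABLE dbo.OverflowDemo (
    Id int NOT NULL,
    c1 varchar(8000),
    c2 varchar(8000)
);

-- 16,000+ bytes of variable-length data in one row: works on 2005, errors on 2000
INSERT INTO dbo.OverflowDemo VALUES (1, REPLICATE('a', 8000), REPLICATE('b', 8000));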


Thanks

View 4 Replies View Related

T-SQL (SS2K8) :: Increase Amount By Fixed Maximum Spread Over Several Rows

Oct 1, 2014

I have an issue where I have multiple rows of data and I need to reduce a dollar amount by a fixed maximum. I am going to throw some code in here to give a rudimentary idea of the data and what the final result should be.

declare @tbl table
(LineNum int,
Code varchar(2),
Amt money,
MaxAmt money

[Code] ....

I need to run an update so that the result of the following query:

select LineNum, Code, Amt, MaxAmt from @tbl

looks like this:

LineNum Code Amt MaxAmt
----------- ---- --------------------- ---------------------
1 AA 10.00 50.00
2 AA 20.00 50.00
3 AA 20.00 50.00

(3 row(s) affected)

I have tried cursors but got unexpected results, or the MaxAmt always defaulted to the original even if I updated it. This seems like a simple problem, but I have been banging my head against the wall for 2 days now. I've written some pretty complicated updates with less effort than this, and I must have some mental block that is keeping me from figuring this out.
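
Since the declaration above is truncated, here is a minimal sketch against the same @tbl shape: because all reads in a single UPDATE see pre-update values, and capping a running total is monotone, each row can be set from the sum of the original amounts on the earlier lines of its Code group:

UPDATE t
SET Amt = CASE
            WHEN r.PriorSum >= t.MaxAmt        THEN 0
            WHEN r.PriorSum + t.Amt > t.MaxAmt THEN t.MaxAmt - r.PriorSum
            ELSE t.Amt
          END
FROM @tbl t
CROSS APPLY (SELECT PriorSum = ISNULL(SUM(t2.Amt), 0)
             FROM @tbl t2
             WHERE t2.Code = t.Code
               AND t2.LineNum < t.LineNum) r;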

View 3 Replies View Related

Merge Join: Nr Of Output Rows Unchanged When Amount Of Input Changes

May 25, 2007

Dear all,



I created a package that seems to work fine with a small amount of data. When I run the package with more data (as in production), however, the merge join output is limited to 9963 rows, no matter how I change the number of input rows.



Situation as follows.



The package has 2 OLE DB Sources, in which SQL-statements have been defined in order to retrieve the data.

The flow of source 1 is: retrieving source data -> trimming (non-key) columns -> sorting on the key-columns.

The flow of source 2 is: retrieving source data -> deriving 2 new columns -> aggregating the data to the level of source 1 -> sorting on the key columns.

Then both flows are merged and other steps are performed.



If I test with just a couple of rows, it works fine. But when I change the WHERE clause in the data source retrieval so that the number of rows is, for instance, 15,000 or 150,000, the number of rows after the merge join is 9963.



When I run the package in debug mode the step is colored green; nevertheless, an error is displayed:

Error: 0xC0047022 at Data Flow Task, DTS.Pipeline: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Merge Join" (4703) failed with error code 0xC0047020. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.



To be honest, a few more error messages appear, but they don't seem related to this issue. The package stops running after some 6,000 rows have been written to the destination.



Any help will be greatly appreciated.



Kind regards,



Albert.

View 4 Replies View Related

Creating Index Time Increases Proportional To The Amount Of Rows?

Apr 28, 2008

Hello!

I have a problem. I want to know whether the time needed for creating an index increases proportionally to the number of rows. For example: if creating an index on a table with 10,000 rows takes 15 seconds, does creating an index on a table with 20,000 rows take 30 seconds, 40,000 rows 60 seconds, and so on?
Or does it take longer, like 10,000 rows 15 seconds, 20,000 rows 40 seconds, 40,000 rows 80 seconds?

thx for your help!!

Filipe

View 4 Replies View Related






