How Wide Is Too Wide?

May 22, 2000

Is a table 45 columns wide too wide? Most of the data is very small (check boxes, one-word fields, etc.). I have one giant form that needs to be filled out and uploaded to the database. I don't want to fragment the table too much because it would be a pain to update. This table will have to be searched, which is why I am concerned with the width. I am hoping that an index would provide enough performance that I can keep a wide table.

Thanks,
Nathan
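
For what it's worth, a wide table that is only ever searched on a handful of columns is usually workable if those columns are indexed. A minimal hedged sketch (table and column names are made up here, since the real schema isn't shown; a composite nonclustered index on the searched columns keeps the wide row out of the seek):

-- hypothetical: index only the columns that appear in WHERE clauses
CREATE NONCLUSTERED INDEX IX_BigForm_Search
ON dbo.BigForm (LastName, SubmittedDate, Status);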

View 4 Replies



Wide Db Search

Jan 4, 2008

OK, I have been charged with the task of searching a server that houses 50-60 client databases. All the databases are identical, created from a prototype. I have discovered an issue whereby one of the client databases is missing a required sproc. They now want me to check all the databases to see if any others are missing the sproc. I have a UN and PASS to log in to the server that houses all the databases. Is there a method for me to query all the databases at once and check for the sproc? Or will I have to go through them all manually (which could become quite time consuming, as there are so many databases and quite a large number of sprocs in each; the sprocs seem to be listed in a semi-random order in each database as well)?
Thanks, all.
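
One hedged sketch (assuming SQL Server 2005 or later, and a procedure name of usp_MySproc -- both are placeholders) that loops the databases with dynamic SQL and prints the ones where the procedure is missing:

DECLARE @db sysname, @sql nvarchar(max);

DECLARE db_cur CURSOR FAST_FORWARD FOR
    SELECT name FROM sys.databases WHERE database_id > 4;   -- skip system databases

OPEN db_cur;
FETCH NEXT FROM db_cur INTO @db;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'IF NOT EXISTS (SELECT 1 FROM ' + QUOTENAME(@db)
             + N'.sys.procedures WHERE name = ''usp_MySproc'')
                 PRINT ''' + @db + N' is missing usp_MySproc''';
    EXEC (@sql);
    FETCH NEXT FROM db_cur INTO @db;
END
CLOSE db_cur;
DEALLOCATE db_cur;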

View 1 Replies View Related

Server Wide UDF

Jul 10, 2007

Is there any way to create server-wide UDFs? We have a lot of functions that are spread across multiple databases, and it's time-consuming going around the databases looking for them. What would be nice is to create functions at the server level that can be accessed from any database, like the GETDATE() function.

Thanks in advance.
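
There is no supported way to make a user-defined function visible everywhere the way built-ins like GETDATE() are. A common workaround, sketched below with made-up names, is to keep the shared functions in one utility database and call them with three-part names:

-- in a shared utility database (the database and function names are assumptions)
USE UtilityDB;
GO
CREATE FUNCTION dbo.fn_TrimBoth (@s varchar(8000))
RETURNS varchar(8000)
AS
BEGIN
    -- trims leading and trailing spaces
    RETURN LTRIM(RTRIM(@s));
END
GO

-- then from any other database on the server:
SELECT UtilityDB.dbo.fn_TrimBoth('  hello  ');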

View 2 Replies View Related

Wide Primary Keys

Jul 23, 2005

I'm working on a system that is very address-centric, and detection of duplicate addresses is very important. As a result we have broken addresses down into many parts (DDL below, but I've left out some reference tables for conciseness), these being state, locality, street, street number, and address. The breakdown is roughly consistent with Australian addressing standards; we're working on finalising this. Because we carry the primary key down each of the levels, this has resulted in our address table having a very wide primary key (around 170 characters). We refer to addresses from a number of other tables, and although my instinct is to use this natural key in the other tables, I wonder if we should just put a unique index on the natural key, create a surrogate primary key and use it in the other tables. Any thoughts?

CREATE TABLE dbo.States (
    StateID varchar(3) NOT NULL,
    StateName varchar(50) NOT NULL,
    CONSTRAINT PK_AddressStates PRIMARY KEY NONCLUSTERED (StateID)
)

CREATE TABLE dbo.Localities (
    Locality varchar(46) NOT NULL,
    StateID varchar(3) NOT NULL,
    Postcode char(4) NOT NULL,
    CONSTRAINT PK_Localities PRIMARY KEY NONCLUSTERED (Locality, StateID, Postcode),
    CONSTRAINT FK_AddressLocalities_AddressStates FOREIGN KEY (StateID)
        REFERENCES dbo.States (StateID)
)

CREATE TABLE dbo.Streets (
    StreetName varchar(35) NOT NULL,
    StreetTypeID varchar(10) NOT NULL,
    StreetDirectionID varchar(2) NOT NULL,
    Locality varchar(46) NOT NULL,
    StateID varchar(3) NOT NULL,
    Postcode char(4) NOT NULL,
    CONSTRAINT PK_Streets PRIMARY KEY CLUSTERED
        (StreetName, StreetTypeID, StreetDirectionID, Locality, StateID, Postcode),
    CONSTRAINT FK_Streets_Localities FOREIGN KEY (Postcode, Locality, StateID)
        REFERENCES dbo.Localities (Postcode, Locality, StateID)
)

CREATE TABLE dbo.StreetNumbers (
    StreetName varchar(35) NOT NULL,
    StreetTypeID varchar(10) NOT NULL,
    StreetDirectionID varchar(2) NOT NULL,
    Locality varchar(46) NOT NULL,
    StateID varchar(3) NOT NULL,
    Postcode char(4) NOT NULL,
    StreetNumber varchar(15) NOT NULL,
    BuildingName varchar(100) NOT NULL,
    CONSTRAINT PK_StreetNumbers PRIMARY KEY CLUSTERED
        (StreetName, StreetTypeID, StreetDirectionID, Locality, StateID, Postcode, StreetNumber),
    CONSTRAINT FK_StreetNumbers_Streets FOREIGN KEY
        (StreetName, StreetTypeID, StreetDirectionID, Locality, StateID, Postcode)
        REFERENCES dbo.Streets (StreetName, StreetTypeID, StreetDirectionID, Locality, StateID, Postcode)
)

CREATE TABLE dbo.Addresses (
    StreetName varchar(35) NOT NULL,
    StreetTypeID varchar(10) NOT NULL,
    StreetDirectionID varchar(2) NOT NULL,
    Locality varchar(46) NOT NULL,
    StateID varchar(3) NOT NULL,
    Postcode char(4) NOT NULL,
    StreetNumber varchar(15) NOT NULL,
    AddressTypeID varchar(6) NOT NULL,
    AddressName varchar(20) NOT NULL,
    CONSTRAINT PK_StreetNumberPrefixes PRIMARY KEY CLUSTERED
        (StreetName, StreetTypeID, StreetDirectionID, Locality, StateID, Postcode, StreetNumber, AddressTypeID, AddressName),
    CONSTRAINT FK_Addresses_StreetNumbers FOREIGN KEY
        (StreetName, StreetTypeID, StreetDirectionID, Locality, StateID, Postcode, StreetNumber)
        REFERENCES dbo.StreetNumbers (StreetName, StreetTypeID, StreetDirectionID, Locality, StateID, Postcode, StreetNumber)
)
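
For comparison, a hedged sketch of a surrogate-key variant of the Addresses table (not a drop-in replacement; the identity column and constraint names here are illustrative). The natural key stays unique, but referencing tables only need to carry a single int column:

CREATE TABLE dbo.Addresses (
    AddressID int IDENTITY(1,1) NOT NULL,
    StreetName varchar(35) NOT NULL,
    StreetTypeID varchar(10) NOT NULL,
    StreetDirectionID varchar(2) NOT NULL,
    Locality varchar(46) NOT NULL,
    StateID varchar(3) NOT NULL,
    Postcode char(4) NOT NULL,
    StreetNumber varchar(15) NOT NULL,
    AddressTypeID varchar(6) NOT NULL,
    AddressName varchar(20) NOT NULL,
    CONSTRAINT PK_Addresses PRIMARY KEY CLUSTERED (AddressID),
    -- the natural key is still enforced, just not carried into other tables
    CONSTRAINT UQ_Addresses_NaturalKey UNIQUE NONCLUSTERED
        (StreetName, StreetTypeID, StreetDirectionID, Locality, StateID, Postcode,
         StreetNumber, AddressTypeID, AddressName),
    CONSTRAINT FK_Addresses_StreetNumbers FOREIGN KEY
        (StreetName, StreetTypeID, StreetDirectionID, Locality, StateID, Postcode, StreetNumber)
        REFERENCES dbo.StreetNumbers
        (StreetName, StreetTypeID, StreetDirectionID, Locality, StateID, Postcode, StreetNumber)
)
-- referencing tables then only need: AddressID int REFERENCES dbo.Addresses (AddressID)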

View 3 Replies View Related

Making A Wide Table -- SQL Query

Sep 24, 2004

Hi Guys,
I have a requirement as follows:

I have a table like below

EMPLOYEEID   OWNS
----------   ----
1            car
1            house
1            dog
2            house
3            car
3            bus
3            shop
3            hotel
3            theater
3            casino

Requirement:
I want to create another table based on the column values. For example, I take the employee id and check what values it has under the OWNS column in the table. I take only three values, and these values should go into the newly created columns (OWNS1, OWNS2, OWNS3).
If there is no value for any of these columns, it should have NULL loaded in it.
The result of the modification should look like this:

EMPLOYEEID   OWNS1   OWNS2   OWNS3
----------   -----   -----   -----
1            car     house   dog
2            house   NULL    NULL
3            car     bus     shop

Note: even though employee id 3 owns more than three things, we only take three of what he owns and populate the above columns.
In addition, the OWNS column will have more than 500 different values in it.

It's kind of urgent, so if anyone knows how to do this, can you please help me with it?
Thanks a lot.
-- Ragulan
;)
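
A hedged sketch of one way to do this on SQL Server 2005 or later (the source and target table names are assumptions; on SQL 2000 the same idea works with a ranked temp table instead of ROW_NUMBER):

SELECT EMPLOYEEID,
       MAX(CASE WHEN rn = 1 THEN OWNS END) AS OWNS1,
       MAX(CASE WHEN rn = 2 THEN OWNS END) AS OWNS2,
       MAX(CASE WHEN rn = 3 THEN OWNS END) AS OWNS3
INTO   dbo.EmployeeOwns3       -- new table; name is made up
FROM  (SELECT EMPLOYEEID, OWNS,
              -- ORDER BY OWNS is arbitrary, since there is no sequencing column
              ROW_NUMBER() OVER (PARTITION BY EMPLOYEEID ORDER BY OWNS) AS rn
       FROM dbo.EmployeeOwns) AS ranked
WHERE rn <= 3
GROUP BY EMPLOYEEID;

Missing slots come back as NULL automatically, since the CASE expression returns NULL for ranks that don't exist and MAX ignores them.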

View 3 Replies View Related

Types Of Joins In Version Wide

Nov 1, 2007

Dear All,
What are the different types of joins available with SQL Server 6.0, 7.0, 2000, 2005 and now 2008?

The reason behind the question is:

I've used the Swiss SQL tool to boost the performance of my join queries, and the tool has suggested some improved versions. Those work fine against the database but fail at the application level.

I found the reason is the loop join, hash join and merge join used in those queries.


Your valuable suggestions are needed.


Thank you very much.

Vinod
Even you learn 1%, Learn it with 100% confidence.
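
For reference, those are the three physical join algorithms SQL Server can choose (nested loops, hash, merge). Tuning tools sometimes force one with a hint, which is probably why the rewritten queries behave differently at the application level. A minimal illustration with generic table names (available at least since SQL Server 2000):

-- force a hash join for this one join
SELECT o.OrderID, c.CustomerName
FROM dbo.Orders AS o
INNER HASH JOIN dbo.Customers AS c
    ON c.CustomerID = o.CustomerID;

-- or hint the whole statement instead of a single join
SELECT o.OrderID, c.CustomerName
FROM dbo.Orders AS o
JOIN dbo.Customers AS c ON c.CustomerID = o.CustomerID
OPTION (MERGE JOIN);

Removing such hints usually lets the optimizer choose the algorithm itself, which is normally the safer default.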

View 7 Replies View Related

Custom Object Wide Roles

Jan 23, 2006

Can we create custom object-wide roles? In the same manner that db_datareader in effect grants SELECT on all tables, can we create roles that affect all objects without having to explicitly grant the permission on every object?
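
On SQL Server 2005 and later, one approach (a sketch; the role and user names are placeholders) is to grant at the schema scope, which covers current and future objects without per-object grants:

CREATE ROLE app_readwrite;

-- schema-scoped grants apply to every object in the schema the permission is valid for
GRANT SELECT, INSERT, UPDATE, DELETE ON SCHEMA::dbo TO app_readwrite;
GRANT EXECUTE ON SCHEMA::dbo TO app_readwrite;

EXEC sp_addrolemember 'app_readwrite', 'SomeUser';   -- 'SomeUser' is a placeholder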

View 1 Replies View Related

Changing All VARCHAR To NVARCHAR Database-wide?

Jan 15, 2005

Hi,
I have an ASP.NET application that uses VARCHAR extensively in the tables and, more importantly, stored procedures (a couple hundred of them).

This app needs to start accepting foreign language in some areas, so I was wondering if there was some way to go through the tables and, more importantly, the stored procedures and change all "VARCHAR" references to "NVARCHAR" ?

Are the stored procedures stored as text files somewhere on the server? If so, I could use some sort of "replace" utility to go through and change all VARCHAR to NVARCHAR.

thanks!

-Bret
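
They are not text files on disk; the definitions live inside the database (syscomments in SQL 2000, sys.sql_modules in 2005 and later). A hedged sketch for listing the procedures that mention VARCHAR so they can be scripted out and edited (note the pattern also matches NVARCHAR, so the list needs a manual pass):

-- SQL Server 2000: search syscomments
SELECT DISTINCT OBJECT_NAME(c.id) AS procedure_name
FROM syscomments c
WHERE OBJECTPROPERTY(c.id, 'IsProcedure') = 1
  AND c.text LIKE '%varchar%';

-- SQL Server 2005+: search sys.sql_modules
SELECT OBJECT_NAME(m.object_id) AS procedure_name
FROM sys.sql_modules m
JOIN sys.procedures p ON p.object_id = m.object_id
WHERE m.definition LIKE '%varchar%';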

View 2 Replies View Related

Upper/Lower Case Set Server-Wide

Nov 3, 2004

Hi,
Can we set upper/lower case sensitivity server-wide in SQL Server 2000, i.e. make the server case sensitive?
Thanks,
Ravi
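
Case sensitivity in SQL Server 2000 is controlled by collation, which can be set at the server, database or column level. A quick hedged sketch for checking the server setting and for forcing a case-sensitive comparison in a single query (the table and column names are placeholders):

-- current server-level collation
SELECT SERVERPROPERTY('Collation') AS server_collation;

-- force one comparison to be case sensitive without changing any collation
SELECT *
FROM dbo.Users
WHERE UserName COLLATE Latin1_General_CS_AS = 'Ravi';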

View 6 Replies View Related

Analysis :: Split Wide Measure Group?

May 14, 2015

I have a wide fact table that I'm feeding to an SSAS cube. I was advised that splitting the measure group into two will improve performance when querying the cube.

I cannot find any documentation that supports this, in fact I get a blue curved line suggesting that I merge the measure groups since they have the same dimensionality and granularity.

I guess the best practice is what the blue line states, but without knowing the internals of SSAS I can understand that a smaller measure group may be easier to handle, or to create more specific aggregations for.

View 4 Replies View Related

Context_info / Connection Wide Variables / Information

Feb 26, 2008

Hello

I know that the following is possible:

declare @x varbinary(128)
select @x = convert(varbinary(128), 'Some user name')
set context_info @x

Then in a trigger, you can say:

UPDATE tbl
SET who_was_kilroy = convert(varchar, p.context_info)
FROM inserted i
JOIN tbl t ON i.keycol = t.keycol
CROSS JOIN master.dbo.sysprocesses p
WHERE p.spid = @@spid

I am searching for a more generic way (varbinary(128) is not big enough) to store and access connection-wide variables.

Any ideas?

Regards
Klaus
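
A hedged alternative sketch (not a built-in mechanism; the table and variable names are made up): keep a small table keyed by @@SPID and read it from the trigger, which removes the 128-byte limit of context_info. The rows need to be cleared when a connection ends or a SPID is reused, which is the main caveat of this approach.

CREATE TABLE dbo.SessionVariables (
    spid     int            NOT NULL,
    var_name sysname        NOT NULL,
    value    nvarchar(4000) NOT NULL,
    CONSTRAINT PK_SessionVariables PRIMARY KEY (spid, var_name)
);

-- set a "connection variable" for the current session
DELETE dbo.SessionVariables WHERE spid = @@SPID AND var_name = N'user_name';
INSERT dbo.SessionVariables (spid, var_name, value)
VALUES (@@SPID, N'user_name', N'Some user name');

-- read it back, e.g. inside a trigger
SELECT value
FROM dbo.SessionVariables
WHERE spid = @@SPID AND var_name = N'user_name';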

View 4 Replies View Related

Site-Wide Security: Restrict Access

Feb 27, 2007

Hi,

I have added several Active Directory groups and set the system roles for each to "System User", and set one of the groups (DBAdmin) to "System Administrator".

My issue is that even after doing this, the users in the other groups are able to access the "Configure site-wide security" link under Security and change the permissions. The only system permission these users have is "View shared schedules" so it doesn't seem that this should be possible.

I would appreciate any feedback on this issue. Thanks!

View 1 Replies View Related

Disabling Referential Integrity Database-wide On Import

Apr 10, 2006

Is there a way to disable referential integrity on all destination tables for an import?
Thanks.
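
A hedged sketch that disables and later re-enables all foreign key and check constraints in the destination database (sp_MSforeachtable is undocumented but widely used; a generated ALTER TABLE script over sysobjects works just as well):

-- disable all FK and CHECK constraints before the import
EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL';

-- ... run the import ...

-- re-enable and re-validate the constraints afterwards
EXEC sp_MSforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL';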

View 1 Replies View Related

Split Wide, Denormalized Table Into Normalized Structure

Aug 27, 2002

Thanks for reading.

This is pretty long, hopefully it isn't rambling.

I'm building a system that imports data from several sources (Excel files, text files, Access databases, etc.) using DTS. The entire process revolves around MS SQL Server, by the way.

I figured I would create denormalized tables that mirror the Excel and flat files, for example, in structure, import data to those, clean up and remove duplicates there, then break those out into my normalized table structure later.

Now I've finished the importing part (though this is going to happen once a week) and I'm onto breaking up the denormalized tables.

I'm hesitating because I'm not sure I've made the best decisions in terms of process, etc.

I've decided to use cursors to loop over the denormalized tables and use batch insert statements to push data out to the appropriate tables.

Any comments? Suggestions? All is welcome.

I'm specifically interested in hearing back on the way I've set up the intermediate, denormalized tables and how I'm breaking them up using cursors (step 2 of the process below). Still, all comments are welcome. As are suggestions for further reading.

Thanks again...

simplified example
(my denormalized tables are 20-30 columns wide)

denormalized table:
===================
name, address, city, state, cellphone, homephone


normalized tables:
==================

tblPerson [PK_person, name, age, height, weight]
tblAddress [PK_address, FK_person, street, city, state, zip, addressType]
tblContact [PK_contact, FK_person, data, contactType]


I'm breaking up the denormalized tables like this (*UNTESTED*):
=================================================

-- declare one variable per column in the normalized structure
-- (sizes here are illustrative; match the real column types)
DECLARE @name varchar(100), @address varchar(100), @city varchar(50),
        @state varchar(50), @cellphone varchar(20), @homephone varchar(20),
        @personID int

DECLARE myCursor CURSOR FAST_FORWARD FOR
SELECT name, address, city, state, cellphone, homephone
FROM _DNT_myWideTable

OPEN myCursor

-- grab the first row from the wide table
FETCH NEXT FROM myCursor
INTO @name, @address, @city, @state, @cellphone, @homephone

WHILE @@FETCH_STATUS = 0
BEGIN
    -- create the person first and capture the new ID
    -- (SCOPE_IDENTITY() is safer than @@IDENTITY if triggers are involved)
    INSERT INTO tblPerson (name) VALUES (@name)
    SET @personID = SCOPE_IDENTITY()

    -- use that ID to coordinate inserts across the other tables
    INSERT INTO tblAddress (FK_person, street, city, state, addressType)
    VALUES (@personID, @address, @city, @state, 'HOME')

    INSERT INTO tblContact (FK_person, data, contactType)
    VALUES (@personID, @cellphone, 'CELLPHONE')

    INSERT INTO tblContact (FK_person, data, contactType)
    VALUES (@personID, @homephone, 'HOMEPHONE')

    -- grab the next row from the wide table
    FETCH NEXT FROM myCursor
    INTO @name, @address, @city, @state, @cellphone, @homephone
END

CLOSE myCursor
DEALLOCATE myCursor
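
One comment worth making: if the wide staging table can be given its own IDENTITY column, the cursor can often be replaced by plain set-based inserts, which tends to be much faster. A hedged sketch (the RowID identity column on the staging table and the SourceRowID staging column on tblPerson are assumptions added purely for matching rows back up):

-- assumes _DNT_myWideTable has an identity column, RowID,
-- and tblPerson temporarily carries a matching SourceRowID column
INSERT INTO tblPerson (name, SourceRowID)
SELECT name, RowID FROM _DNT_myWideTable;

INSERT INTO tblAddress (FK_person, street, city, state, addressType)
SELECT p.PK_person, w.address, w.city, w.state, 'HOME'
FROM _DNT_myWideTable w
JOIN tblPerson p ON p.SourceRowID = w.RowID;

INSERT INTO tblContact (FK_person, data, contactType)
SELECT p.PK_person, w.cellphone, 'CELLPHONE'
FROM _DNT_myWideTable w
JOIN tblPerson p ON p.SourceRowID = w.RowID;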

View 1 Replies View Related

How To Do Totaling Column-Wide In A Matrix Table

Jun 14, 2005

Hi

View 10 Replies View Related

Removing Blank Space Caused By Wide Table

Apr 5, 2007

Hi all,



Report:

-For instance, 2 small tables (e.g. 10 cm wide, roughly 4 inches)

-And one wide table (e.g. 30 cm wide, roughly 12 inches)

All separated by "insert pagebreak after table".



Problem:

When rendered, the pages with the small tables on have a lot of white blank space at the right of the table. This is probably caused by the big table on page 3.

This report is distributed by email in Excel format. So on sheets 1 and 2 there are a lot of white cells to the right of the tables. When printing, the users just want to use the "landscape" option and the "fit to page" option. Because of the empty white cells, the fit-to-page option reduces the first 2 tables to a very small table which covers only 50% of the page width. The other 50% is reserved for the empty cells.



Of course, I know that deleting the empty cells offers a solution, but it would be a lot handier if there were no empty cells in the first place.



Anybody with a solution?

View 2 Replies View Related

Problem Calculating Wide-char Strings' Character Count

Jan 8, 2008



It's like this: I have a temporary table such as
create table temp_table (str varchar(50))
and i have a data table
create table data_table (str varchar(20))

Now I import my data (which contains some corrupted lines) into the temporary table. The values should all be ANSI-character strings of no more than 20 characters, but some bad rows containing wide characters are mixed in. Because of these wide characters, each corrupted string takes over 20 bytes, yet I can't filter them out: when I use len(str), SQL Server returns the character count instead of the byte count. I thought this should only happen with a Unicode data type (e.g. nvarchar), but the server also behaves like this on ANSI data types; it seems all string functions that deal with string length behave this way.

So when I try to run:
insert into data_table select str from temp_table where len(str) <= 20
or
insert into data_table select left(str,20) from temp_table

it always ends up with a string truncation error:
String or binary data would be truncated. The statement has been terminated

So now my problem is: how do I get the byte count, rather than the character count, of a string containing wide characters?


I'm using SQL Server 2005 Standard edition with SP2.
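
For the byte count specifically, DATALENGTH() returns bytes while LEN() returns characters (and LEN also ignores trailing spaces), so the filter could be written as a sketch like:

-- byte-based filter: DATALENGTH counts bytes, LEN counts characters
INSERT INTO data_table (str)
SELECT str
FROM temp_table
WHERE DATALENGTH(str) <= 20;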

View 2 Replies View Related

Integration Services :: How To Insert Rows In A Wide Table With Over 13K Columns

Aug 21, 2015

I have a flat file with 13K columns which I need to load in a wide table.

The flat file does not even have column names and no datatypes defined.

How to load data in the wide table?

Also, suppose I choose to load the data into 13 different work tables.

How do I define data types in the flat file connection manager in SSIS for 13,000 columns?

View 5 Replies View Related

SQL Server 2014 :: Best Way To Pull Login Data For Auditing System Wide?

May 29, 2015

I am trying to import this year's worth of failed logins and the last successful login for each user out of the logs using master.dbo.xp_readerrorlog. The script essentially loops through the linked servers I have on my DBA box and reaches out for the log data. It works, but here is the error I am getting on most of our production servers:

OLE DB provider "SQLNCLI11" for linked server "AWSCADENCEDB01" returned message "The partner transaction manager has disabled its support for remote/network transactions.".

Msg 7391, Level 16, State 2, Line 17
The operation could not be performed because OLE DB provider "SQLNCLI11" for linked server "AWSCADENCEDB01" was unable to begin a distributed transaction.

I know how to enable distributed transactions on the servers that error out, but if it is not needed for anything other than my audit script, I doubt the business will approve turning on distributed transactions at those locations (so I am not even going to ask).

I am attempting to setup a singular audit .rdl with the information I want to review quarterly.

CREATE PROC [dbo].[Import_Login_Data]
AS
IF EXISTS (
SELECT 1
FROM master.sys.servers
WHERE is_linked = 1

[Code] ....

View 2 Replies View Related

VERY Wide Rows, 8060 Bytes Row Size Limit & SPARSE Columns

Apr 23, 2008

Hi,
I'm trying to create a VERY wide table, with 1,000 columns of type varchar(MAX), nullable.
The CREATE TABLE statement (in both SQL 2005 & 2008) gives the following warning:


Warning: The table "WIDE_TABLE" has been created, but its maximum row size exceeds the allowed maximum of 8060 bytes. INSERT or UPDATE to this table will fail if the resulting row exceeds the size limit.

When I insert data into the table, filling all columns with small, 10-byte string values, I get the following error:

Msg 50000, Level 16, State 1, Procedure sp_pivot, Line 118

Cannot create a row of size 15034 which is greater than the allowable maximum of 8060.

I'd like to verify this observation: each row is created with 2,000 bytes of offset data (2 bytes * 1,000 columns), 125 bytes for the null bitmap (1,000 columns / 8 bits) and some more "wasted" row information. This leaves less than 6K for the data itself. But since not all columns can fit within the page, forwarding pointers in the row need to be created, at 24 bytes per column, which very quickly adds up to more than 8K, hence the error. So the 8K limit is hit at far fewer columns than the 1,024-column maximum.

Furthermore, in SQL 2008, SPARSE columns will not solve the problem (they may save some "metadata" space when the columns are null, but if not, I'm back to the same problem again, or even worse, since now each value takes more storage space). The max 30,000 columns in 2008 is only for cases where the column values are really sparse…

Is this the right observation? If so, is there a workaround besides splitting into multiple tables?

Thanks,

Aviv.
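
For anyone who wants to reproduce the scenario described above, a hedged sketch that builds the 1,000-column table dynamically rather than by hand (the table name WIDE_TABLE matches the warning text):

DECLARE @sql nvarchar(max), @i int;
SELECT @sql = N'CREATE TABLE dbo.WIDE_TABLE (', @i = 1;

-- append 1,000 nullable varchar(max) column definitions
WHILE @i <= 1000
BEGIN
    SET @sql = @sql + N'c' + CAST(@i AS nvarchar(10)) + N' varchar(max) NULL,';
    SET @i = @i + 1;
END

-- trim the trailing comma, close the definition and run it
SET @sql = LEFT(@sql, LEN(@sql) - 1) + N')';
EXEC (@sql);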

View 7 Replies View Related

Finding Database Sizes And Unallocated Space On A Server Wide Scale

Apr 22, 2008



Morning forum,

I'm having a problem to which I'm sure the answer is simple...

All I want is a list of databases on my server with their allocated size and the free space within. Something similar to the first table that sp_spaceused gives you, but on a server-wide scale.

As I say, I'm sure there's a simple solution out there, but alas Google has failed me.

Thanks in advance,

Dan.
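
A hedged sketch for the allocated-size half using sys.master_files (SQL 2005 and later); free space inside the files still needs a per-database look at FILEPROPERTY or sp_spaceused:

-- allocated size per database, in MB (size is stored in 8 KB pages)
SELECT d.name AS database_name,
       SUM(mf.size) * 8 / 1024 AS allocated_mb
FROM sys.master_files mf
JOIN sys.databases d ON d.database_id = mf.database_id
GROUP BY d.name
ORDER BY allocated_mb DESC;

-- free space inside the files of the *current* database
SELECT name AS file_name,
       size * 8 / 1024 AS size_mb,
       (size - FILEPROPERTY(name, 'SpaceUsed')) * 8 / 1024 AS free_mb
FROM sys.database_files;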

View 4 Replies View Related

Integration Services :: Flat File Destination Columns Are Too Wide (too Many Trailing Spaces)

Nov 6, 2015

I'm loading data from a SQL Server table into a flat file. The flat file connection manager has the following settings:

General:
  Format: Delimited
  Text qualifier: "
  Header row delimiter: {CR}{LF}
  Header rows to skip: 0

Columns:
  Row delimiter: {CR}{LF}
  Column delimiter: comma (,)

View 4 Replies View Related

Transact SQL :: How To Bulk Insert Rows From Text File Into A Wide Table Which Has 1400 Columns

Feb 3, 2010

We can easily load a file into db tables. However, my main concern here is the number of columns in the file. A text file, TEXT_1400.txt, has 1400 columns. I am unable to load data into my db table using BCP or BULK INSERT commands, as a maximum of 1,024 columns are allowed per table in SQL Server 2008.

We can still go ahead and create a 'wide table' (a special table that holds up to 30,000 columns; the maximum size of a wide table row is 8,019 bytes). But when operating on the wide table, BCP/BULK INSERT commands still fail. After a few hours of scratching my head over BCP and BULK INSERT, I observed that while inserting, BCP/BULK INSERT commands are unable to identify SPARSE columns and skip them, which disturbs the column mapping and results in data conversion and truncation errors.
 
Is there any proper way to load this kind of file into the db table?

View 6 Replies View Related

Case Insensitivity Is On Server-Wide: Tables Render Case Sensitive...

Jan 6, 2005

Hello:

I created a SQL Server table in the past on a server that was entirely case sensitive. Over time I found out that switching to a server that is not case sensitive still left my data case sensitive. I read an article that said you should rebuild your master database and then re-create your tables. So after rebuilding the master database, a basic restore would not be sufficient? I would have to go and manually re-create every single table again?

Any suggestions?

View 4 Replies View Related

Odbc - Binding Sql Server Binary Field To A Wide Char Field Only Returns 1/2 The Daat

Jul 23, 2005

Hi, I have a Visual C++ app that uses ODBC to access a SQL Server database. I'm doing a select to get the value of a binary field and binding a char buffer to that field as follows (the field in the database is binary(16)):

char lpResourceID[32+1];
rc = SQLBindCol(hstmt, 1, SQL_C_CHAR, &lpResourceID, RESOURCE_ID_LEN_PLUS_NULL, &nLen1);

and this works fine. However, trying to move the codebase to UNICODE I tested the following:

WCHAR lpResourceID[32+1];
rc = SQLBindCol(hstmt, 1, SQL_C_WCHAR, &lpResourceID, RESOURCE_ID_LEN_PLUS_NULL, &nLen1);

but it only returns 1/2 the data. Any ideas? I thought this would work fine; not sure why it is losing data. All ideas welcome.

John

View 2 Replies View Related

Report Objects Shift To Right On Very Wide Report - Why?

Mar 13, 2007

Greetings,

I have a very wide report of more than 20 inches. I've placed several parameter values in the report header section so that the user can see what filters have been applied to the data. The textboxes shift their position several inches to the right when the report is run from the Report Manager.

Is there a way to make sure that a textbox is displayed at an absolute position? I thought maybe there would be a property on the report or body object that controls this but I don't see one.

Thanks for your help,

BCB

View 7 Replies View Related






