Data Larger Than 255 Characters
Jul 20, 1998
Is it possible to make a text field, or something like that, which can contain more than 255 characters?
Hi, we have trouble when we try to use the 'dbuse' calls with databases whose names are longer than 28 characters; it looks like dbuse truncates the name after that. Any ideas?
I have written code to combine and delete redundant data in my system. The table structure remains the same, except that I changed some INTs to TINYINTs.
When I run sp_spaceused, it tells me that the number of rows is smaller (which is correct), but the data size and index_size are significantly larger AFTER the deletions.
I tried using shrink, but that doesn't seem to change anything.
When I right-click the database, and choose PROPERTIES, it also confirms that the database got significantly larger.
I am confused about how deleting data and changing to TINYINTs could make my database bigger. What would cause this?
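A hedged diagnostic sketch, assuming a placeholder table name of dbo.MyTable: shrinking a column from INT to TINYINT does not rewrite the existing rows, and space freed by deletes can stay allocated to the table, so comparing sp_spaceused before and after an index rebuild usually shows whether the growth is just unreclaimed space (ALTER INDEX needs SQL Server 2005 or later; on SQL 2000, DBCC DBREINDEX does the same job).

EXEC sp_spaceused 'dbo.MyTable';          -- reserved / data / index_size before the rebuild
ALTER INDEX ALL ON dbo.MyTable REBUILD;   -- rewrites the rows, compacting the shrunk columns and deleted space
EXEC sp_spaceused 'dbo.MyTable';          -- compare the figures after the rebuild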
We have been trying for a while to use the Power BI tools (Power Query, Power BI Designer (Desktop) and Power BI Designer (Cloud)) with larger data sources (2-7m records in the fact table), unfortunately with unsatisfactory results so far. It seems that these tools are just not designed to handle this kind of data volume?
To my understanding it seems that the approach with PowerBI components is to always try to create a local (or cloud based) cache that only has a limited capacity. 2m+ records already seem to be exceeding this limit. So as opposed to firing off a query on demand (for example only based on distinct filter options as opposed to entire fact set) only the intermediary cache of the model can be used.
Our initial focus was to use a normalized table in SQL Server with around 4m records. First problem is that Power Query/Power BI Designer fails to provide a complete list of the distinct filter items:
This I can understand to some extent, as doing a distinct on a large data set like that is not trivial. As a workaround I could imagine setting up a star-schema dimension table with the distinct "dimension members", i.e. the filters are driven by this table, which has only a few thousand rows, and are then applied to the fact table. I haven't found a way to do this effectively with Power Query.
Another option we tried was the cloud-based Power BI service in conjunction with a SQL cloud service holding the same data set. That unfortunately didn't work. The setup of the source works fine, but as soon as we try to start a query by dragging a value field onto the dashboard, errors occur:
Dear,
I created a package that gets data from files and database sources, does some transformations, retrieves dimension IDs and then inserts them into a fact table.
Running this package with a limited amount of data (a few hundred thousand records) does not produce any errors and everything goes fine.
Now, running the same package (still in debug mode) with more data (about 2,000,000 rows) doesn't produce any errors either, but it just stops running. In fact, it doesn't really stop, but it doesn't continue either. If I had only been waiting a few minutes or hours, I might think it was still processing, but I have waited about a day and it is still 'processing' the same step.
Any ideas on how to dig further into this in order to find the problem? Or is this a known problem?
Thanks for your ideas,
Jievie
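A hedged diagnostic sketch (SQL Server 2005 or later): while the package appears stuck, this query against the destination server shows whether the loading session is blocked or what it is waiting on, which is usually the first thing worth checking before assuming it is still processing.

-- Sessions above 50 are user sessions; look for the package's connection
SELECT session_id, status, command, wait_type, wait_time, blocking_session_id
FROM sys.dm_exec_requests
WHERE session_id > 50;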
I am using SQL Server 2000 Developer Edition. My table definition is as follows:
CREATE TABLE query_table (
    id INT IDENTITY (1, 1) NOT NULL,
    qtext CHAR (4000)
)
When I try to insert data longer than 256 characters, only the first 256 characters are inserted. Please let me know why this is happening.
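A hedged check, assuming the table above: because CHAR(4000) pads with trailing spaces, LEN (which ignores trailing spaces) reports how much real text each row holds, which separates a genuine truncation at insert time from a display limit in the query tool (Query Analyzer, for instance, caps the characters it shows per column by default).

-- If this reports lengths above 256, the data is stored intact and only the display is truncated
SELECT id, LEN(qtext) AS stored_length
FROM query_table;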
I have a situation where I need to migrate data from an older platform to a newer one. The data from the old system(s) will be available on DAT tapes. All database construction on the new system will be identical to the old one in size and schema, except for one table (call it "ARCHIVE").
If the ARCHIVE table on the old system is 210MB, and the ARCHIVE table on the new system has the same attributes but has been expanded to 380MB in size, can I simply restore the dump for the old table into the new ARCHIVE?
Empirically it works (I have done it with apparent success two times) but I seem to recall that backups are done by pages, and I'm concerned that there may be conditions not being met by simply doing the restore the way I'm planning to do it.
Also, are there any tests or checks built into SQL which I can use to check table integrity on the target ARCHIVE table after the restore?
Any help is greatly appreciated.
Best rgds,
Kevin
We are planning to install more disk space and need to temporarily move the data and log files and move them back after the disk space has been added.
Is Detach/Attach the best way to handle this?
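For what it's worth, detach/attach is the usual way to do this; a minimal sketch with placeholder names and paths (sp_detach_db/sp_attach_db are the SQL 2000-era procedures; on 2005 and later CREATE DATABASE ... FOR ATTACH does the same job):

EXEC sp_detach_db 'MyDatabase';
-- copy or move the .mdf and .ldf files at the OS level, add the disk space, then:
EXEC sp_attach_db 'MyDatabase',
    'D:\NewLocation\MyDatabase.mdf',
    'D:\NewLocation\MyDatabase_log.ldf';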
In my application I must store over 16,000 characters in a SQL table field. When I split the text into more than one field, I get an "unclosed quotation mark" message.
How can I store over 16,000 characters in a single SQL table field, including language-specific characters?
Thanks
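A hedged sketch, assuming SQL Server 2005 or later (on SQL 2000 the column type would be NTEXT instead): NVARCHAR(MAX) holds far more than 16,000 characters and preserves language-specific characters, and the "unclosed quotation mark" error usually comes from an apostrophe inside the text breaking a concatenated SQL string, which passing the text as a parameter avoids. Table and variable names below are placeholders.

CREATE TABLE dbo.Documents (
    DocID INT IDENTITY(1,1) PRIMARY KEY,
    Body  NVARCHAR(MAX) NOT NULL    -- stores well beyond 16,000 characters
);

DECLARE @body NVARCHAR(MAX);
SET @body = N'...the 16,000+ character text goes here...';

-- Parameterised insert: apostrophes in @body cannot break the statement
EXEC sp_executesql
    N'INSERT INTO dbo.Documents (Body) VALUES (@b)',
    N'@b NVARCHAR(MAX)',
    @b = @body;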
Hi,
I want to be alerted when any of my databases grows larger than xxx GB.
I know one way to do that: an alert in SQL Agent with a performance condition, but with that approach I need to create a separate alert for every database.
Do you know a better way to achieve this?
THX
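A hedged sketch of one alternative (SQL Server 2005 or later): a single SQL Agent job that checks every database via sys.master_files and flags any that cross the threshold, so there is only one thing to maintain. The 100 GB figure is a placeholder for the xxx GB limit; the job step could RAISERROR or call sp_send_dbmail when rows come back.

-- size is stored in 8 KB pages; sum the data and log files per database
DECLARE @limit_mb INT;
SET @limit_mb = 100 * 1024;   -- placeholder threshold

SELECT DB_NAME(database_id) AS database_name,
       SUM(size) * 8 / 1024  AS size_mb
FROM sys.master_files
GROUP BY database_id
HAVING SUM(size) * 8 / 1024 > @limit_mb;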
I have two int fields in my database, CEOAnnualBonus and CEOBonus, and I want to return whichever one has the larger value as CEOBonusCombined. I thought using COALESCE would do the trick, like below, but there are many cases where CEOAnnualBonus or CEOBonus has a zero value instead of NULL, and then it doesn't work.
SELECT
COALESCE(CEOAnnualBonus, CEOBonus)
AS CEOBonusCombined
FROM tbenchmarktemp
WHERE Ticker='F'
Thanks for any help
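A hedged sketch of one way around the zero problem: COALESCE only skips NULLs, so returning the larger of the two values needs an explicit comparison, treating NULL as zero.

SELECT CASE
           WHEN ISNULL(CEOAnnualBonus, 0) >= ISNULL(CEOBonus, 0)
               THEN ISNULL(CEOAnnualBonus, 0)
           ELSE ISNULL(CEOBonus, 0)
       END AS CEOBonusCombined
FROM tbenchmarktemp
WHERE Ticker = 'F';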
I have a Windows 2003 server with SQL Server 2005 installed. The server is on a small drive and we would like to upgrade to much larger hard drives. I've been hearing of problems using Ghost to get an image and placing the image onto the new drive. I think this is more of a Windows 2003 problem, but this server is for nothing but the SQL Server databases. Does anyone have a clear method of moving this server to the larger drives? TIA.
Greetings all, and thanks for reading this post.
Here is my situation... I have 2 fairly large databases; the full backups are 83 GB and 63 GB. I am in the process of moving these databases to a new data center. I've taken full backups of these databases and shipped them to the new center. I have been taking transaction log backups (larger db every 24 hrs, smaller db every 15 min ... from log shipping).
I want to restore these databases in the new data center. I've gone ahead and restored the dbs in the new location.
Question on the final cutover: can I just apply the transaction logs to the databases at cut-over, or do I have to restore the database backups first and then apply the transaction logs?
Is there another way to do this that I'm missing?
Thanks.
Kurt
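A hedged sketch of the usual sequence, assuming the full backups in the new data center were restored WITH NORECOVERY (or WITH STANDBY) so the log chain can still be applied; if they were restored WITH RECOVERY, the full backup has to be restored again before any logs. File names are placeholders.

-- Initial restore leaves the database able to accept log backups
RESTORE DATABASE BigDB FROM DISK = 'D:\Backups\BigDB_full.bak' WITH NORECOVERY;

-- Apply each log backup in sequence, all but the last WITH NORECOVERY
RESTORE LOG BigDB FROM DISK = 'D:\Backups\BigDB_log_001.trn' WITH NORECOVERY;
RESTORE LOG BigDB FROM DISK = 'D:\Backups\BigDB_log_002.trn' WITH NORECOVERY;

-- At final cutover, the last log restore brings the database online
RESTORE LOG BigDB FROM DISK = 'D:\Backups\BigDB_log_final.trn' WITH RECOVERY;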
I have replication set up between our main site and a remote one, and have recently noticed that the remote site database's .MDF file is about three times as large as the main site's. This doesn't seem to make sense, since essentially all of our data is replicated between the two servers. Can anyone suggest why this might be happening and what is safe to do to shrink the remote file?
TIA
Ron L
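A hedged sketch, assuming the extra space in the remote .MDF really is unused (sp_spaceused reports how much is unallocated): DBCC SHRINKFILE can release it without touching the replicated data. The logical file name and target size are placeholders, and shrinking is only worth doing if the file is not simply going to grow again.

USE RemoteDB;
EXEC sp_spaceused;                       -- check the 'unallocated space' figure first
DBCC SHRINKFILE (RemoteDB_Data, 2048);   -- shrink the data file to roughly 2 GB (placeholder target)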
Hello.
I created SDF database files for SQL Compact with a size larger than 128 MB (the default at creation). Now when I try to open these files with VS2005 I get the error "The size of the database file exceeds the configured maximum... Required Max Database size (in MB; 0 if unknown)" (translated from German). The real problem is that I cannot change the connection string in VS, because all fields showing the connection string are read-only (greyed out). I know I have to set the option "Max Database Size = 512" or so in the connection string to get things running, but I don't know any way to do that in VS2005.
My attempt to access the SDF files when copied to the device via ActiveSync results in the same error message.
This seems to me to be a design flaw, because I cannot add the optional parameter to the connection string (not even in the details form, where only "DataSource" and "Password" fields are displayed).
- Does anybody know a solution in VS2005?
- Does anybody have a workaround for me?
- Does anybody know where the connection strings of VS2005 are stored, so that I may "hack" the connection string?
Thanks so far.
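For reference, SQL Server Compact does accept the size limit directly in the connection string, e.g. "Data Source=MyData.sdf; Max Database Size=512;" (the file name is a placeholder); the difficulty described above is only that the VS2005 dialog does not expose a field for the extra keyword.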
I created a report with RS in VS.NET and set the width and height to 8.5in by 11in in the property window. When I designed it, everything fit on one page nice and neat.
When printed, it prints on two pages and the font size comes out much larger than expected. The whole document seems to have been blown up bigger and the right side of the document has been cut off. Why is this? Do I need to configure VS.NET to print?
Am I missing some setting somewhere?
Other documents print out fine on this printer, so it is not the printer.
Any help would be greatly appreciated, thank you
this seems small but if I can't get the report to print out right then..........
hi all
Does anybody know why the fields of my DB with the type "text" can store a maximum of 64 characters? I thought fields of type "text" could hold an unlimited number of characters. Is some setting wrong? I'm using Visual Web Developer with SQL Server Express.
I am exporting a table of comments. There are some line returns in the comments; some of these entries are whole paragraphs! For some reason, when I export the data, the line return within the comment column is treated as a new record. I am using the -c character data type, so the newline character is the row terminator. How do I get the BCP OUT to ignore a newline character within a record?
For example:
ID~Comment
-- -------
1~This is a comment
2~Hi,how are (user hit carriage)
you (you is part of next row in bcp out)
3~Next record
Thanks!
Joyce
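A hedged workaround sketch: since -c makes every newline a row terminator, one option is to strip or replace the embedded CR/LF in the query itself and export with bcp ... queryout instead of exporting the bare table. Table and column names are placeholders.

-- Replace embedded carriage returns and line feeds with a space before export
SELECT ID,
       REPLACE(REPLACE(Comment, CHAR(13), ' '), CHAR(10), ' ') AS Comment
FROM dbo.Comments;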
I have built a SSIS package that reads in data from a SQL Server 2005 source database into a flat file destination. The Row Delimiter is {CR}{LF}. The Column Delimiter is Tab {t}.
The data being read from the SQL Server database contains both {CR}{LF} and Tab {t} characters in various fields on several rows.
How can I process the input data from the SQL Server to remove these characters before passing it to the destination output file?
Sorry if this is obvious to all, but I am only just starting with SSIS...
Many thanks
Adrian
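A hedged sketch of one approach: clean the text in the source query itself (a Derived Column transform can do the same job) so the tab and CR/LF bytes never reach the flat file. Table and column names are placeholders.

-- CHAR(9) = tab, CHAR(13)/CHAR(10) = carriage return / line feed
SELECT REPLACE(REPLACE(REPLACE(SomeTextColumn,
           CHAR(9),  ' '),
           CHAR(13), ' '),
           CHAR(10), ' ') AS SomeTextColumn
FROM dbo.SourceTable;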
I have an existing Access database that I need to transfer over to a more powerful back-end due to the need for larger size capacity. We need a back-end that can grow to just about any size, due to us scanning in documents via ODBC. With Access I know I was limited to about 4 GB, and when split onto my current SQL Server I have heard I will be stuck at 10 GB? If so, can you recommend a better back-end? My question, though, is about the front end: I hear a Windows WPF application can be linked to SQL Server, but does this limit the size as well?
Books Online gives a way to send a message larger than the VARCHAR max of 8000 chars, but the @query argument to xp_sendmail is a simple text string and my data is much more complex, and formatted. Also, BOL shows an example using a temporary text table, but it is not clear precisely how you write your insert statements. I tried the following, which writes out all the data and sends it OK, except that after each row there is about a page of blank spaces. What is wrong with my syntax?
SET LANGUAGE British
GO
DECLARE @msgstr VARCHAR(80)
DECLARE @cmd VARCHAR(80)
DECLARE @PMID INT
DECLARE @forename VARCHAR(30)
CREATE TABLE ##texttab (c1 text)
SET @msgstr = 'THE FOLLOWING QUOTES ARE CURRENTLY MARKED AS PENDING:'
INSERT ##texttab SELECT @msgstr
DECLARE C2 CURSOR FOR SELECT ProjMgrID FROM surdba.SVY_QUOTES WHERE StatusID=6
OPEN C2
FETCH NEXT FROM C2 INTO @PMID
WHILE @@FETCH_STATUS = 0
BEGIN
    IF @PMID > 1000
        SELECT @forename = ISNULL(Forename,' ') FROM surdba.SVY_PERSONNEL_GENERAL WHERE EmployeeID = @PMID
    ELSE
        SET @forename = ' '
    INSERT ##texttab values (RTRIM(@forename))
    FETCH NEXT FROM C2 INTO @PMID
END
CLOSE C2
DEALLOCATE C2
INSERT ##texttab values ( ' - This information is autogenerated from the Survey database.')
SET @cmd = 'SELECT c1 FROM ##texttab'
EXEC master.dbo.xp_sendmail @recipients = 'Robin Pearce',
@subject = 'ALL PENDING QUOTES',
@query = @cmd,
@no_header = 'TRUE'
DROP TABLE ##texttab
GO
Would appreciate any help on this one, I do not have time to learn HTML,
thanks
Robin Pearce
I have a SQL server with multiple instances on it and would like to move one of them to a drive with more storage.
I have SQL 2010 on a server with 2 partitions.
The database is located on the C: drive (original build) but the drive isn't partitioned to handle a db of the size that this one will grow to. I would like to move the full DB instance to another partition.
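A hedged sketch of moving the files without a full detach, assuming SQL Server 2005 or later and placeholder names and paths: ALTER DATABASE ... MODIFY FILE records the new location, the database is taken offline, the files are copied at the OS level, and it is brought back online.

ALTER DATABASE MyDB MODIFY FILE (NAME = MyDB_Data, FILENAME = 'E:\Data\MyDB.mdf');
ALTER DATABASE MyDB MODIFY FILE (NAME = MyDB_Log,  FILENAME = 'E:\Logs\MyDB_log.ldf');
ALTER DATABASE MyDB SET OFFLINE;
-- copy MyDB.mdf and MyDB_log.ldf to the new partition, then:
ALTER DATABASE MyDB SET ONLINE;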
Hi
I find that users of my web site are using Canadian French character encoding such as ALT0233 (= é) or ALT0244 (= ô) when completing text boxes for data input on .aspx pages.
When saved to the SQL db, these characters are converted to é or ô and when retrieving the data, appear as é or ô in the text box.
The datatype in the table is nvarchar(60).
Data is saved using command.Parameters.AddWithValue("@PostingTitle", Server.HtmlEncode(Trim(Me.txtPostingTitle.Text)))
How can I save the data with the correct character inserted into the db and subsequently retrieve the character?
Thanks in advance.
Hey everyone!
I'm doing an export from SQL into excel spreadsheet and then am going to clean out certain parts of the data with global search/replace. The problem is that the SQL data is full of special characters such as |'s and the little box looking characters.
How do I export without these characters?
I know it's possible; I did it about 2 years ago and remember I did some crazy file conversion (to WK3 or something), but I no longer remember how.
Any help would be much appreciated!
Thanks,
Geoff
PS, attached is a screenshot of the data to give you an idea of what I'd like to strip!
I have a table with several columns of information, and I wish to set up some form of schedule to go through this data and remove any special characters that may interfere with other code processes.
Mainly the commas and the apostrophes. They really mess with my ASP pages and scripts when retrieving this information and trying to do other things with it, so I need to figure out how to remove them from the tables so they do not cause these issues.
The problem is, I cannot figure out how to keep the data in the row/column and just strip the special characters out of it. The other problem is that everything I try requires me to insert either a comma or an apostrophe as part of the code string, which is exactly where my issue lies.
How can I parse through my data, leave the data as-is, but just get rid of commas, apostrophes, and double quotes?
Does anyone have a basic example that I can use to expand on?
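A hedged sketch of the in-place clean-up, with placeholder table and column names: inside a T-SQL string literal an apostrophe is escaped by doubling it, which is how the statement can mention the very characters being removed. Worth testing on a copy of the table first.

UPDATE dbo.MyTable
SET MyColumn = REPLACE(REPLACE(REPLACE(MyColumn,
        ',', ''),       -- commas
        '''', ''),      -- apostrophes ('' inside the literal is one escaped single quote)
        '"', '');       -- double quotes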
Is there a way to change the font that the data viewer uses, so that the Chinese characters don't appear as boxes?
The data viewer displays Chinese characters as boxes, something similar to [_], at least on a computer with the following regional settings.
get-wmiobject CIM_OperatingSystem | ft OSLanguage, CodeSet, Locale
OSLanguage CodeSet Locale
---------- ------- ------
1033       1252    0409

The data itself is flowing correctly into the target database with a pipeline data type of DT_WSTR. The ideograms can be seen by query utilities which support a Unicode font (e.g. Management Studio).
Hi,
I'm having difficulties loading data into a table from an Excel file, because one of the columns is text-based with an average of 1024 characters... How can I import that column? The Excel source always shows me the column as a DT_WSTR of 255 characters...
Best Regards,
Luis Simões
Hi everybody,
I would like to know if there is any property in a SQL 2000 database to distinguish lowercase characters from uppercase characters. I mean, not to treat the values 'child' and 'Child' as the same. We are transferring our Ingres database into SQL Server. In Ingres we have these values and we consider them to be different values. Can we have that in SQL Server too?
Hellen
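A hedged illustration: in SQL Server this behaviour is controlled by collation rather than a single database property; a case-sensitive collation (the _CS_ part of the name) makes 'child' and 'Child' compare as different values, and it can be set at the database or column level. The table below is just an example.

CREATE TABLE dbo.Sample (
    Name VARCHAR(50) COLLATE SQL_Latin1_General_CP1_CS_AS   -- case-sensitive column
);

INSERT INTO dbo.Sample (Name) VALUES ('child');
INSERT INTO dbo.Sample (Name) VALUES ('Child');

SELECT COUNT(*) FROM dbo.Sample WHERE Name = 'child';   -- returns 1, not 2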
We have a database we are replicating to about 8 SQL Express subscribers from a SQL 2012 SP2 publisher. The size of the database grew too large for the 10GB license limit for SQL Express and now replication refuses to replicate any of our deletions on the publisher to reduce the size of the database. I've come up with a few options below.
1) Drop one of the larger table indices on the subscriber database to get below the size restriction. Permit the replication to replicate the deleted records and then rebuild the index. (I'm not sure how important an index is to this table. Is it merely performance related?)
2) "Upsize" SQL Express to SQL Standard on the affected boxes. Allow the deletes to replicate. Backup the database, downgrade to SQL Express and restore the database back to SQL a new SQL express instance. This would involve a lot of work on each box. I'd like to avoid it if possible.
I am comparing two fields, one from our legacy table and one in our new table structure, that should have identical text data. The new field has an assortment of ANSI characters where the legacy data did not have these. Is there anything I can do that will ignore all ANSI character differences? The only route I can think of is to do a replace on each ANSI character type on the new column, but there are quite a few character types.
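A hedged thought, assuming the differences are mainly accented versus unaccented letters: forcing an accent-insensitive collation onto the comparison lets those characters match without rewriting either table. Table and column names are placeholders.

-- Rows whose text still differs even when accents are ignored
SELECT n.KeyID, n.NewText, l.LegacyText
FROM dbo.NewTable n
JOIN dbo.LegacyTable l ON l.KeyID = n.KeyID
WHERE n.NewText COLLATE Latin1_General_CI_AI
      <> l.LegacyText COLLATE Latin1_General_CI_AI;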
I am new to SQL but have managed to create a table with five columns. The problem I am having is that when I try to run the INSERT command I get the error "Maximum characters exceeded in SQL statement".
My table code is:
create table myemployees_MPA0510
(FirstName Varchar (15), LastName
varchar (20), Title varchar (25),
Age number (3), salary number (9));
And my INSERT code is:
Insert into myemployees_MPA0510
(First, Last, Title, Age, Salary)
Values ('Jonie', 'Weber', 'Secretary', 28, 19500.00) ('Potsy', 'Webber', 'Programmer', 32,
45300.00) ('Dirk', 'Smith', 'Programmer II', 45 75020.00) ('Mark', 'Aldridge' 'Technical', 52,
12000.00) ('Peter', 'Wright' 'Admin' 30, 11000.00) ('Lucy', 'May', 'Technical', 15500.00)
('Robert', 'Hurst', 'Finance', 54, 16000.00) ('Ann', 'Green', 'HR', 43, 21000.00);
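A hedged rewrite of the insert for comparison: the column list has to match the table's column names (FirstName/LastName rather than First/Last), every value needs its comma, and the most portable form is one INSERT statement per row (comma-separated multi-row VALUES lists need SQL Server 2008 or later).

INSERT INTO myemployees_MPA0510 (FirstName, LastName, Title, Age, Salary)
VALUES ('Jonie', 'Weber', 'Secretary', 28, 19500.00);

INSERT INTO myemployees_MPA0510 (FirstName, LastName, Title, Age, Salary)
VALUES ('Potsy', 'Webber', 'Programmer', 32, 45300.00);

INSERT INTO myemployees_MPA0510 (FirstName, LastName, Title, Age, Salary)
VALUES ('Dirk', 'Smith', 'Programmer II', 45, 75020.00);
-- ...and so on, one INSERT per remaining row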
Hi, I'm trying to retrieve some data from a table where the content is in Greek; however, the query is not working. It's a very simple statement, but I'm missing something. Here is the table...

if exists (select * from dbo.sysobjects where id = object_id(N'[dbo].[REPORT_LOCALE]') and OBJECTPROPERTY(id, N'IsUserTable') = 1)
drop table [dbo].[REPORT_LOCALE]
GO
CREATE TABLE [dbo].[REPORT_LOCALE] (
[XL_REPORT_ID] [int] NULL ,
[TEXT_NAME] [nvarchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[LOCALE] [int] NULL
) ON [PRIMARY]
GO

The first statement shows me a number of rows. I copied the content of the Text_Name column and pasted it into QA to form the second statement. However, the second statement returns no data.

SELECT * FROM Report_Locale
SELECT * FROM Report_Locale WHERE Text_Name = 'Λογ.Διαχ. – ΤÏ?.-Î*Ï?ουπ.-Διαφ.'

Hopefully the Greek characters will display properly within this post, but the idea is basically to take the Greek text and build it into a query. I can do the remainder later once I understand why this does not work as I expect. I realise my expectation is based on doing things in English, so I need to understand the differences. We've done this for various other languages using other character sets, which is why I am puzzled.

Any pointers?
Thanks
Ryan
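A hedged note on the query above: against an nvarchar column, a plain string literal is interpreted in the connection's default code page, so Greek characters can be altered before the comparison ever runs; prefixing the literal with N keeps it Unicode. The value below is a placeholder for the real Text_Name.

-- N'...' marks the literal as Unicode (nvarchar), so the Greek characters survive
SELECT * FROM Report_Locale
WHERE Text_Name = N'Λογ.Διαχ.';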
View 1 Replies View RelatedI have a source sql 2005 with the database collation SQL_Latin1_General_CP1_CI_AS and destination with sql 2012 with the same collation.
But the SQL server llvel collation is different, sql 2005 uses Latin1_General_CI_AI and sql 2012 uses "SQL_Latin1_General_CP1_CI_AS"
Now when i load the data from 2005  for one table to sql 2012 i could see special characters in one column. And i dont see that in the source database. Is there a way to avoid that or is it something we need to manually fix.