Losing Table Attributes
Aug 1, 2000. When I copy tables from one database to another (using the DTS Wizard), I lose my settings: primary keys and default values!
Any help would be appreciated.
Thanks
I am working in SQL Server Master Data Services, version 11.0.5058.0 (SP2).
I have been asked to group all the financial attributes together. When I move one of the attributes up using the arrows, it works well, jumping over one attribute at a time. Then I reach a section of attributes where it leapfrogs over 24 attributes.
It appears these 24 attributes are in a subgroup, but there are no attribute groups, and I removed the subscription view from the entity. If I move one of the attributes in the group of 24, it moves outside of the 24 attributes.
This is under leaf member attributes. There are no collection or consolidated groups.
I'm using a DW from the Northwind database to build a cube to do some analytical tasks. I have already created the cube and now I am "cleaning" the dimensions. I'm having some difficulty understanding the logic of this part: when I created the Data Source View, I only imported the foreign keys that connect the dimensions to the fact table. Do I have to drag the attributes of the dimension from the Data Source View to the attributes tab?
Imagine this:
I have the following dimension:
Dim_Customer:
Customer_ID
Name_Customer
Job_Function
Date_of_Birth
Contact
Address
City
Country
When I create the cube, only Customer_ID appears in the attributes tab. Is that normal?
One more question:
I don't want to create a hierarchy like:
Customer ID -> Name_Customer
Customer ID -> Date_of_Birth
Customer ID -> Address
Customer ID -> City
Customer ID -> Country
My idea is to create the following hierarchy:
Name_Customer -> Date_of_Birth -> Address -> City -> Country
But the first hierarchy that I showed always appears instead. Do you know why this happens?
Hello all-
I have a specification table that has some attributes defined.
SpecId - Id of the specification
Attribute - Attribute of the spec (like Color, HP, etc.)
Value - The value of the attribute
Then I have a car table that actually has information about the cars. The intention is to take each specification and find the cars that match it. If the car has more attributes than the spec, we ignore the extra attributes for the match. But if the car has fewer attributes, we don't consider the car a match at all (even if the attributes that are present match). To summarize, the car's attributes should be a superset of the spec's attributes.
The code I have below is bad because I am joining the same tables twice. In addition, it fails the condition "the car's attributes should be >= the spec's attributes". (A corrected sketch follows the code below.)
Any help is greatly appreciated.
DECLARE @Specification TABLE
(SpecId VARCHAR(10),
Attribute VARCHAR(100),
Value VARCHAR(100))
DECLARE @Car TABLE
(CarName VARCHAR(10),
Attribute VARCHAR(100),
Value VARCHAR(100))
INSERT INTO @Specification VALUES ('S1', 'Type', 'Sedan')
INSERT INTO @Specification VALUES ('S1', 'Transmission', 'Auto')
INSERT INTO @Specification VALUES ('S1', 'HP', '220')
INSERT INTO @Specification VALUES ('S2', 'Type', 'SUV')
INSERT INTO @Specification VALUES ('S2', 'Transmission', 'Manual')
INSERT INTO @Specification VALUES ('S2', 'HP', '300')
INSERT INTO @Car VALUES ('Accord', 'Type', 'Sedan')
INSERT INTO @Car VALUES ('Accord', 'Transmission', 'Auto')
INSERT INTO @Car VALUES ('Accord', 'HP', '220')
INSERT INTO @Car VALUES ('Accord', 'Color', 'Black')
INSERT INTO @Car VALUES ('Escape', 'Type', 'SUV')
INSERT INTO @Car VALUES ('Escape', 'Transmission', 'Manual')
INSERT INTO @Car VALUES ('Escape', 'HP', '300')
INSERT INTO @Car VALUES ('Explorer', 'Type', 'SUV')
INSERT INTO @Car VALUES ('Explorer', 'Transmission', 'Manual')
SELECT DISTINCT Spec.SpecId, Car.CarName
FROM @Specification Spec
INNER JOIN @Car Car
ON Spec.Attribute = Car.Attribute
AND Spec.Value = Car.Value
WHERE Spec.SpecId NOT IN (SELECT Spec.SpecId
FROM @Specification Spec
LEFT OUTER JOIN @Car Car
ON Spec.Attribute = Car.Attribute
AND Spec.Value = Car.Value
WHERE Car.CarName IS NULL)
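A common pattern for this requirement is relational division: join the spec attributes to the car attributes, then keep only the cars whose matched-row count equals the spec's total attribute count. A minimal sketch against the table variables above (extra car attributes fall out automatically, since only matching rows survive the join):

SELECT Spec.SpecId, Car.CarName
FROM @Specification Spec
INNER JOIN @Car Car
    ON Spec.Attribute = Car.Attribute
    AND Spec.Value = Car.Value
GROUP BY Spec.SpecId, Car.CarName
HAVING COUNT(*) = (SELECT COUNT(*)
                   FROM @Specification s
                   WHERE s.SpecId = Spec.SpecId)

With the sample data above this returns (S1, Accord) and (S2, Escape); the Explorer drops out because it never matches S2's HP row, so its match count stays below the spec's attribute count.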
Bit of a design question, as I'm interested to know if anyone's done anything like this... This is my main table (ish): Thing(ThingId, Ref). I then need to be able to give this "Thing" any number of attributes:
Thing1 - Type:Red, Location:London
Thing2 - Type:Blue, Height:400, Width:300
Thing3 - Height:500, Location:Norwich
But I have no idea how to model this in the database. It needs to be done in such a way that I can add a Thing and all its attributes in one database hit, basically (is there a stored procedure you could pass an array into?). My initial thoughts were to have:
Thing(ThingId, Ref)
Attribute(AttributeId, ThingId*, AttributeTypeId*, Value)
AttributeType(AttributeTypeId, Description)
Is that completely mad? It seems like quite a lot of data accesses to enter a Thing. It could be Thing(ThingId, Ref, Type, Location, Height, Width), but then when "Thing - Color:White" comes along the model is stuffed. Any ideas? (Hope that makes sense.)
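The "array into a stored procedure" part is workable on SQL Server 2008 and later via table-valued parameters. A minimal sketch under that assumption, using the proposed tables (the AttributeList type name is made up, and ThingId is assumed to be an IDENTITY column):

CREATE TYPE AttributeList AS TABLE
(
    AttributeTypeId INT,
    Value NVARCHAR(255)
);
GO
CREATE PROCEDURE AddThing
    @Ref NVARCHAR(50),
    @Attributes AttributeList READONLY
AS
BEGIN
    DECLARE @ThingId INT;

    INSERT INTO Thing (Ref) VALUES (@Ref);
    SET @ThingId = SCOPE_IDENTITY();

    -- One set-based insert covers any number of attributes
    INSERT INTO Attribute (ThingId, AttributeTypeId, Value)
    SELECT @ThingId, AttributeTypeId, Value
    FROM @Attributes;
END

Calling this from ADO.NET with a single SqlDbType.Structured parameter inserts the Thing and all of its attributes in one round trip.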
In Query Analyzer, what is the command to tell me the attributes of the entities in a table? In Oracle I can use the DESCRIBE command. I know there is a way to do it in Query Analyzer, but I can't remember how. I can also look visually by expanding the table's node, but doing it from the command line in Query Analyzer is sometimes quicker.
Example. I want to find out about a table named "Employee"
What command would I type that would tell me all of the columns/attributes in that table, and their data types?
Bill
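The usual Query Analyzer equivalents of Oracle's DESCRIBE are sp_help and the INFORMATION_SCHEMA views; for example:

EXEC sp_help 'Employee'    -- columns, data types, indexes, constraints

SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH, IS_NULLABLE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'Employee'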
I am losing data from time to time and cannot figure out where the rows are vanishing to. In essence, I have 5 files that I process. The processing occurs on a daily basis by one SQL Agent job that calls 5 individual packages, each doing a different piece of work.
I stage each file to a temp table, then perform some minor transformations, then union the rowsets from each file before landing them in another temporary table. A subsequent package performs additional transformations that apply to the entire dataset before inserting the rows into their final destination. Along the way, I reject some records based on the business rules applied. Each package in the entire job logs OnError, TaskFailed, and Pre/Post Execute events. There are no errors being generated. No rows are being rejected to my reject tables either.
Without getting into the specific transforms being used in this complex process, has anyone seen similar unexplained behaviour? I've not been able to identify any pattern, except that it is usually only 1 or 2 of the record types (specific to a source file) that ever fail to load. No patterns around volumes for specific source files. There are some lookups and dedupes; however, I've seen the records 'drop off' before reaching these transforms. I have noticed my final destination load was not using fast load. However, sometimes the records disappear before even getting to my final staging table, which is inserted using fast load. I am going to turn on logging of the PipelineRowsSent event. Any other suggestions for troubleshooting/tracking down these disappearing records?
Thanks
I must be missing something somewhere...
I have a simple table with three fields: ID, LastName, FirstName. The ID is defined as the PK. In the table is a record of "12345, Smith, John". The incoming flat file has a record of "12345, Smith, Johnny".
In the SCD transform, the ID is the business key, and Last Name and First Name are defined as historical attributes.
During the load, the SCD transform correctly sends the data down the right path, but the insert fails with a primary key violation - as I would expect since it's trying to create a new current record.
How do I get around this problem without removing the PK?
thx
I have a datetime field used to store a date of birth. When inserting the date of birth into the table it works fine. But when the date of birth is in the month of October, it takes an hour off. For example, if the date is 04-OCT-1993, it inserts as 03-OCT-1993 23:00.
It only happens for dates in October. It's being inserted via a .NET table adapter. It just so happens that October is the month that daylight saving kicks in for parts of Australia (an hour is added), but I think this must be a coincidence.
I'd like to create a table that will store different order items. Several order items make up one single order. Order items can have 0 or more children (max depth will never be deeper than one). Order items can have up to 150 attributes/values. The way I think this should be done is using XML column instead of the EAV type of model. My table structure currently looks like this:
* child_order_item_id (PK)
* parent_order_item_id (FK to child_order_item_id)
* order_id (FK to Order table)
* product_id (FK to Product table)
* price
* attribute_XML
My question is how my attribute_XML should look, and how to validate the XML.
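One option, assuming a simple name/value shape for the attributes, is to bind the column to an XML schema collection so SQL Server validates every insert. A minimal sketch (the element and collection names here are invented):

CREATE XML SCHEMA COLLECTION OrderItemAttributesSchema AS
N'<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="attributes">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="attribute" maxOccurs="unbounded">
          <xs:complexType>
            <xs:attribute name="name" type="xs:string" use="required" />
            <xs:attribute name="value" type="xs:string" use="required" />
          </xs:complexType>
        </xs:element>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>';
GO
-- Declaring the column as attribute_XML XML(OrderItemAttributesSchema) then
-- rejects any document that does not validate; it would accept, e.g.,
-- <attributes><attribute name="Color" value="Black" /></attributes>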
I cannot remember how to pass a value to a stored procedure. I would work this through, but I am really running out of time; any help greatly appreciated. This is my stored procedure, and I need to pass CompanyID from the code-behind page in as the stored procedure's @C_ID value.
PROCEDURE dbo.EditCompanyInfo
    @C_ID int,
    @CS_CompanyName nchar(100),
    @CS_City nchar(50)
AS
    UPDATE tblCompanyInfo_Submit
    SET CS_CompanyName = @CS_CompanyName,
        CS_City = @CS_City
    WHERE C_ID = @C_ID
    RETURN
This is my .aspx page:
<asp:Content ID="Content2" ContentPlaceHolderID="MainContent" Runat="Server">
    <asp:TextBox ID="TextBox1" runat="server"></asp:TextBox><br />
    <asp:TextBox ID="TextBox2" runat="server"></asp:TextBox><br />
    <asp:TextBox ID="TextBox3" runat="server"></asp:TextBox><br />
    <asp:DetailsView ID="DetailsView1" runat="server" DataSourceID="SqlDataSource2"
        Height="50px" Width="125px">
        <Fields>
            <asp:CommandField ShowEditButton="True" ShowInsertButton="True" />
        </Fields>
    </asp:DetailsView>
    <asp:SqlDataSource ID="SqlDataSource2" runat="server"
        ConnectionString="<%$ ConnectionStrings:ConnectionString2 %>"
        InsertCommand="CompanyInfoSubmit" InsertCommandType="StoredProcedure"
        OnInserted="SqlDataSource2_Inserted"
        SelectCommand="SELECT CS_CompanyName, CS_City FROM tblCompanyInfo_Submit WHERE C_ID = @CompanyID"
        UpdateCommand="EditCompanyInfo" UpdateCommandType="StoredProcedure">
        <UpdateParameters>
            <asp:Parameter Name="CS_CompanyName" Type="String" />
            <asp:Parameter Name="CS_City" Type="String" />
        </UpdateParameters>
        <InsertParameters>
            <asp:Parameter Direction="ReturnValue" Name="ReturnValue" Type="Int32" />
            <asp:Parameter Name="CS_CompanyName" Type="String" />
            <asp:Parameter Name="CS_City" Type="String" />
        </InsertParameters>
    </asp:SqlDataSource>
    <br />
</asp:Content>
CODE BEHIND:
public partial class aaatest : System.Web.UI.Page
{
    int CompanyID;

    protected void Page_Load(object sender, EventArgs e)
    {
        if (!Page.IsPostBack)
        {
            DetailsView1.ChangeMode(DetailsViewMode.Insert);
            TextBox1.Text = "insert";
            TextBox3.Text = Convert.ToString(CompanyID);
        }
    }

    protected void SqlDataSource2_Inserted(object sender, SqlDataSourceStatusEventArgs e)
    {
        foreach (System.Data.SqlClient.SqlParameter param in e.Command.Parameters)
        {
            string RValue = Server.HtmlEncode(param.Direction.ToString());
            if (RValue == "ReturnValue" && Page.IsPostBack)
            {
                TextBox1.Text = Server.HtmlEncode(param.Value.ToString());
                TextBox2.Text = "Return";
                CompanyID = Convert.ToInt16(TextBox1.Text);
                TextBox3.Text = Convert.ToString(CompanyID);
            }
        }
    }
}
I have a stored procedure, and one of the fields in the SELECT is a datetime type. On SELECT I get the date and the time, but no AM/PM designator. The datetime in the query is returned in this format:
2008-03-21 08:37:20.000
Any help is appreciated.
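A datetime value itself stores no AM/PM; the designator only appears when the value is formatted. If the goal is just to render it, CONVERT styles 100 and 109 include AM/PM (the table and column names below are placeholders):

SELECT CONVERT(VARCHAR(26), CreatedDate, 100) AS WithAmPm   -- mon dd yyyy hh:miAM
FROM dbo.SomeTable

Style 109 gives the same with seconds and milliseconds (mon dd yyyy hh:mi:ss:mmmAM).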
We're using SQL Server 7 with an Access 2000 MDB as a front end with ODBC linked tables. I recently created a new set of tables for the app, and users are complaining that unsaved data is being lost when they move to a new record. This seems to be the case when there are multiple users. When there is a single user using it, we don't seem to have that problem.
It seems that we had this problem when we first converted from an MDB back end to a SQL 7 back end, years ago, but we haven't had this problem in a while. These are the first "entirely new" tables created in several years, and we seem to be having that problem again.
Is this something with SQL 7 when it's dealing with new tables? Any ideas on what to do?
Thanks!
Neil
I'm a newbie to both Windows Forms development and SQL Server CE, so please bear with me...
I am developing a vocabulary builder application, and my data keeps reverting to the data set I originally imported. I keep losing the records I've added using the application I'm developing, or from within VS2005. I know about the 'Copy To Output Directory' option in the Solution properties and I set this to 'Do not copy'.
What gives? Not sure what other information you might need, but I suspect it's something really basic...
Thanks!
This isn't really DB related, except for the connection, but I was wondering if any of you guys had run into this before.
I have an ODBC System DSN set up for a connection to my MSSQL2K server. My users will occasionally lose the Default Database setting (a drop-down during DSN setup), forcing me to go in and reset it. It's getting to be a pain in the ass.
Thanks!
Ed
Following on from some problems I have been having with flat files, I seem to be stumbling from one issue to another.
I have coded my text file with some very contrived delimiters to ensure that my real data isn't tripping up the package. I am stripping all line break chars, tabs etc.
Now I seem to get all rows imported, but numerous integer columns within the file are being set to zero.
When I toggle the Keep Nulls option in the data flow task, it will either import the rows, but with the zeros, or not import the rows at all. I have opened the file in text editors and in Excel and it all seems fine; in fact, the same source file works fine with DTS/2000.
At a guess, it seems as though these columns are seen as null by the package, and so with the option switched on they default to zero, but they are most certainly not null in the file! The table is a staging table and so is very basic in structure (no defaults or constraints).
SSIS seems a bit buggy, or at the very least oversensitive, to me. I am at the point of abandoning it!
Please, has anyone seen similar issues?
Hi, I am having a problem when I try to execute an SSIS package from a job in SQL Server 2005.
First of all, when I create a new connection in the designer, I fill in the server, user, and password, and the connection tests successfully.
Then I execute the whole package and it works fine.
After building it, I tried to execute the package using the file that had just been created in the bin folder, but the connections didn't have the password tag, so after the user id I added ;password=xxxx to every connection and it worked fine. But every time I want to execute it, I have to type the password.
I also tried opening the text file that the DTS created, but when I open it again, the password isn't there.
Finally, I tried to create a job that executes this file, and I had to type the password as I did when executing directly from the file, but it couldn't run; it has an authentication problem. If I open the step configuration again, the password is not there.
The same thing happens every time, except that now I cannot run the package.
Any idea?
Thanks a lot
Hi - I am attempting to run an SSIS package through a SQL Agent job step. The package is fairly basic: I run some Execute SQL tasks, build an ADO recordset, enter a For Each loop, and perform some table updates based on the ADO recordset. So far so good.
Approximately 1/3 of the records in my For Each loop are processed, then I get a "Failed to acquire connection" error. All I am doing for each iteration of the loop is updating one record in a table based upon the current value being processed from the ADO recordset.
I am running the job on a 64-bit box, version 9.00.3050.00. I have also run this on two other 64-bit boxes (versions 9.00.1399.06 and 9.00.1406.00) and had the same issue. The owner of the Agent job is a sysadmin on the box.
I don't understand why some items would be processed and then the connection drops midway through processing. What's even odder is that the connection is dropped at the same point every time I run. The one connection defined in my package is of provider type "Native OLE DB\SQL Native Client", and I am using Windows authentication. The version of Visual Studio I developed this package in is 8.0.50727.42. Please help.
Hi,
I have written a little bit of VB.NET code that basically takes three strings, transforms them, and returns a single string to be stored in my table.
I am running into a strange problem, however... for some reason, a number of my processed rows are missing a pair of double quotes (").
The vast majority of the records are formatted properly, and have the double quotes in the expected locations.
The most frustrating thing about it is that I have included the offending input strings in my Test.sql test script, and when I step through the entire routine, the return value is perfect...
I apologize for being the worst ever at posting questions here; please let me know if I can add anything.
Hi,
I am facing a weird problem while syncing. When I do a synchronize from a mobile device to the server DB, one of my table columns fails to update while all the other columns get updated.
The table uses row-level tracking (I tried making it column-level too), and the field which fails to update is a date field (nullable).
This happens when syncing after updating more than one record on the device. When syncing after updating just one record on the device, this date field gets updated as expected.
Note: I have filters set for this table. The filter downloads only rows where this date field = null.
I need to download all records meeting this filter condition, but at the same time this filter should not be applied while uploading changes, as I think this is where the problem is.
If anyone has faced this and got a solution, please let me know.
Thanks.
We have several servers running SQL Server 2005, each upgraded from SQL Server 2000. Each server has about 50 databases on it. Each database has a table with the same name, containing login information for the database that the table resides in.
We have a VB 6.0 application that allows our Help Desk to do a SELECT on the table that contains login information. All it does is a SELECT; no UPDATEs, DELETEs, INSERTs, or DDL operations are performed by this application. The application logs into the databases using a Security Group login (let's call it MyGroup). Each member of the Help Desk team is a member of MyGroup.
About twice a month, the MyGroup security group loses SELECT permissions on all of the databases on a server. The server affected usually is different each time, but sometimes the same server will be affected two times in a row.
So far, we just run a script to update the SELECT permissions on the table in each of the databases, and this takes care of the problem (for now). But the problem seems to be recurring regularly.
What would make the server lose all permissions for a particular security group? The other logins continue to work fine. Only this one security group seems to be affected.
I just can't figure this one out! Help, please!
Nancy
Hi, I'm working on a project that connects to a database on every view of a site. It works fine for a few hours but eventually it just stops connecting to the database. There's nothing in the application logs to suggest where to look, it seems like the connections are just dying.
I am creating the connection in a pretty normal manner:
SqlCommand myAwesomeCommand = new SqlCommand();
myAwesomeCommand.Connection = new SqlConnection(myAwesomeConnectionString);
myAwesomeCommand.CommandText = myAwesomeSQL;
myAwesomeCommand.CommandType = CommandType.Text; // Otherwise known as awesomeText
myAwesomeCommand.ExecuteScalar();
I also use a SqlDataAdapter and fill it for some of my stuff for grabbing more complex data, but all happy System.Data.SqlClient stuff.
But yeah, the thing seems to just give up after a few hours. Everything I have read about connection pooling in ADO.Net says that the connections won't be reused. And yes, I believe I am closing all of my connections after I use them. The site is getting, oh, 7500 hits an hour maybe? Somewhere around that, maybe more maybe a little less.
Anyone have any ideas?
When I restore a set of databases from backup, all login information is lost. The front end of the application provides a way to create new users, but it is a very long process. I am looking for a script or stored procedure that will help attach the users to the database.
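The classic fix here is re-linking the restored database users to the server logins with sp_change_users_login (the user name below is a placeholder):

-- List users in the restored database that have no matching login:
EXEC sp_change_users_login 'Report'

-- Re-link one orphaned user to the login of the same name:
EXEC sp_change_users_login 'Auto_Fix', 'SomeUser'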
I backed up and restored a database successfully, except for the users. The only user that shows is dbo, and I can't add a user with the proper user name from my old server; I get the error that the user or role already exists in the current database.
When I go to roles, under the name 'public', there is the user I need to add. I can't remove it as it's a table owner, but I need to add it to the list of users, so I need to remove it and then re-add it.
I tried using EXEC sp_changeobjectowner to no avail.
Thanks for any help/guidance.
I have 2 machines, each running the same version of SQL Server. I'm trying to use DTS to export my data from one to the other, and it works fine, except for the fact that I lose my unique index on a field. Is there a way I can keep this from happening, or am I going to have to recreate it each time? I wouldn't think I should have to; it seems like bad design to me.
Does anyone know how to reset or reorder a primary key ID column without losing any data in the table? (A sketch follows the list below.)
At the moment my IDs are:
7
9
19
21
and I want them to be:
1
2
3
4
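Assuming the column is a plain INT key (an IDENTITY column cannot be UPDATEd in place, and any foreign keys referencing these values would have to be updated as well), a ROW_NUMBER()-based renumbering is one sketch (SQL Server 2005+; the table name is hypothetical):

WITH Renumbered AS
(
    SELECT Id, ROW_NUMBER() OVER (ORDER BY Id) AS NewId
    FROM dbo.MyTable
)
UPDATE Renumbered
SET Id = NewId;

That said, gaps in a surrogate key are normally harmless, so renumbering is rarely worth the risk.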
Hi,
I'm testing the use of application roles for security. The customer I work for still has a lot of ASP intranet applications running. We're migrating the databases to a SQL Server 2005 server.
I've changed the connection string to a user without any permissions except to log on. After that, I use an application role for permission to select from different tables and to execute stored procedures.
The first queries execute, but after that I get "Permission denied", as if I no longer have the application role.
Any ideas?
Adrian
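One thing worth checking: an application role has to be activated on each physical connection, and with ASP connection pooling a pooled connection that never ran sp_setapprole (or whose role context was reset) fails with exactly these permission errors. The activation call looks like this (role name and password are placeholders):

EXEC sp_setapprole 'MyAppRole', 'MyAppRolePassword'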
On my SSIS package I have 2 connection managers, both connecting to an Oracle db. I enter the ID and password, click 'Test Connection', and I'm able to connect to the database fine. I then go to a data flow of one of my control flow tasks, open up my OLE DB Source, and click preview. The SQL query executes with no issues; I can see the data from my connection manager source.
I then go and run the package and I get this error message:
[OLE DB Source [1]] Error: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "OracleConnection" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed.
I've been racking my brain on this for a few days and still have had no luck getting this to work.
Hi
We've got a server running SQL Server 2000 SP4 with a database that's being replicated.
There are 17 subscribers running MSDE SP4 using merge replication. Replication is started manually.
Initially we tested this with two subscriptions and everything went well, but now, for the past 3 months, we have been facing a weird problem while syncing. We have massive data loss on records that were inserted at the subscribers. Records seem to disappear, but only records that have a foreign key constraint. What I mean is that, for example, a record is inserted into the table that holds our client records with primary key 'ClientID', and then a record is inserted into a table of actions for that client with a foreign key 'ClientID' referring to the client table. After syncing, the client record is inserted correctly into the database on the publisher, but the records in the actions table are gone.
As far as I know the tables are correctly formed, with identity set to 'not for replication' and so on.
In short, I can't find any problem, especially since it doesn't always happen.
If anyone has faced this and got a solution, please let me know.
Thanks.
Raf
I have a control flow set up with 5 tasks.
1. Create Temp Table (Execute SQL)
2. Load Temp Table (Execute SQL)
3. Data Flow Task using the temp table
4. Data Flow Task using the temp table
5. Drop Temp Table (Execute SQL)
Tasks 3 & 4 are currently set up to run in parallel, meaning they both branch directly off of task #2. They both also lead to task #5. Also, my connection manager is set to retain its connection.
Here's the problem. When executing this flow, task #3 will sometimes fail and other times succeed, without me changing anything between runs (I simply end debugging and run it again). When it does fail, it is always because it cannot find the temp table. One thing to note: task #4 (running at the same time as #3 and using the same temp table) always finishes first because it does quite a bit less work. Last thing to note: if I set task #4 to run after #3, it works every time, but of course the total package takes much longer to execute.
How is it that the task can find the temp table on some runs, but not on others? If this is just quirky behavior that I am going to have to live with for the time being, is there a way to force a task to re-run X number of times before giving up and moving on?
Thanks,
David Martin
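One thing worth knowing here: a #local temp table is visible only to the session that created it, and parallel tasks contending for one retained connection are known to behave unpredictably about which physical connection each task actually gets. A global temp table (or a permanent staging table) removes the dependence on which connection a task lands on; a minimal sketch:

-- Visible from any session for as long as the creating session stays open:
CREATE TABLE ##Staging
(
    Id INT,
    Amount MONEY
);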
Hi,
I've experienced a problem (bug?) when deploying reports to the report server. My users kept complaining about losing authentication to reports whenever I deploy a new report. I found the problem, but I don't understand why it is happening. My colleague developed the earliest reports and logs in with a different user on Remote Desktop to develop them. When a report is deployed, the data sources are deployed also.
Now when I log in on Remote Desktop with a different user than my colleague, open the same report project, and check the data sources, the credentials are gone?! So when I deploy a report, the data source with the empty credentials is deployed too, which causes all the authentication problems.
My question: how come I can't see the user credentials my colleague filled in? When I open the report project the credentials are gone. I log out, my colleague logs in, and the credentials are back.
I suppose this has something to do with Remote Desktop and the users connecting to it?
Thanks in advance!
I have written a simple SQL Server 2005 package to pull some data from Oracle (using ODBC) and pump it into SQL Server. When I run it from the server in debug mode in VS, it works fine. When I schedule the job, it errors out with "ora-01005: null password given; logon denied." The password is there. Has anyone experienced this? Is there a security setting somewhere preventing me from saving passwords? Is there a workaround? Thanks.
Hi, report data sources keep losing their credentials. We use custom data sources and store credentials on the server, yet for some reason the credentials get removed and have to be re-entered.
Has anyone come across this before?