Efficiently Supplying XML Data For Treeview

May 16, 2008

I have an interesting problem:

I have an ASP.NET web application that uses a TreeView control to display what can potentially be a very large data set. In the past, I would just run a recursive stored procedure in my database that would output the XML, which I would save to a file. The TreeView used the XML file as its data source. I did this because the stored procedure can take so long to run (10 seconds or more) that it isn't practical to have the TreeView point directly at it. This worked well enough because the data didn't change very often.

Now it looks as if the application will be used in a production environment, and I really need to find a way to supply up-to-date data to the TreeView dynamically. I have tried creating a view that would provide XML and be updated any time the target table is updated, but that has not worked. I have also tried creating a trigger that would output to an XML file any time an edit was made (using the xp_cmdshell functionality), but that has proven difficult as well.

Is there a simpler solution that I am just missing? I just want an up-to-date XML representation of the data that is a result of a recursive function.
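For reference, the kind of query I would love to point the TreeView at directly is a recursive CTE with FOR XML. A minimal sketch, assuming a hypothetical Categories(ID, ParentID, Name) table (recursive CTEs require SQL Server 2005 or later):

WITH Tree AS
(
    -- root nodes start at depth 0
    SELECT ID, ParentID, Name, 0 AS Depth
    FROM   Categories
    WHERE  ParentID IS NULL
    UNION ALL
    -- walk down one level at a time
    SELECT c.ID, c.ParentID, c.Name, t.Depth + 1
    FROM   Categories c
    JOIN   Tree t ON c.ParentID = t.ID
)
SELECT ID, ParentID, Name, Depth
FROM   Tree
ORDER BY Depth, ID
FOR XML PATH('node'), ROOT('nodes')

Note this returns a flat node list with parent pointers rather than nested elements, so the nesting would still happen in page code; truly nested XML would need a recursive function or FOR XML EXPLICIT.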

Thanks for any help you can provide.




Way To Efficiently Make Test Data ?

Dec 26, 2006

We have a website that accesses our SQL databases. In the past, we used our internal employees to improve our SQL databases. However, we want to outsource the work.

There is a lot of information we would like to keep private from the outsourcing company.


Is there a way to efficiently make test data throughout our database without changing our original database?
Is there a way to easily update our database to the new changes?

I found this product through Google... EMS Data Generator for SQL Server
http://www.sqlmanager.net/en/products/mssql/datagenerator
Would this program help us make test data?
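If a tool turns out to be overkill, one low-tech approach we have considered is restoring a copy of the database and scrambling only the sensitive columns in the copy. A rough sketch, with illustrative file, table, and column names:

RESTORE DATABASE SalesTest
FROM DISK = 'D:\Backups\Sales.bak'
WITH MOVE 'Sales_Data' TO 'D:\Data\SalesTest.mdf',
     MOVE 'Sales_Log'  TO 'D:\Data\SalesTest.ldf'

-- overwrite anything confidential in the copy only
UPDATE SalesTest.dbo.Customer
SET    LastName = 'Customer' + CAST(CustomerID AS varchar(10)),
       Phone    = '555-0100'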

Thanks in advance
-Devin


Building Treeview

Jan 3, 2008

 
userfeature:

userName   featureId
a          1
a          5
b          1
b          5
b          9
c          5
c          9

menu:

id    Pid   Name
1           Administrator
2     1     Create User
3     1     Delete User
4     1     View log
5           WSR
6     5     X
7     5     X
8     5     X
9           Manager
10    9     Y
11    9     Y

Using the above tables I want to create a treeview based on the user's login. Please let me know if there is any previous sample for this situation.
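A sketch of the kind of query that could drive it, assuming the tables are named userfeature and menu as shown, and that featureId points at the top-level menu rows as the sample data suggests (the treeview nesting itself would still be done in page code):

SELECT  parent.id   AS GroupID,
        parent.Name AS GroupName,
        child.id    AS ItemID,
        child.Name  AS ItemName
FROM    userfeature uf
JOIN    menu parent ON parent.id = uf.featureId
LEFT JOIN menu child ON child.Pid = parent.id
WHERE   uf.userName = 'b'          -- the logged-in user
ORDER BY parent.id, child.id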
 
 


Treeview Query With T-SQL

Feb 27, 2008

I need to query a SQL database with T-SQL and return the data in a nested form:
ID, ParentID, Title
I also need to know which level each item is at within the query, making the result something like:
ID, Title, Depth
Any help?
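A minimal sketch of the usual approach, a recursive CTE (SQL Server 2005 and later), assuming a hypothetical Items(ID, ParentID, Title) table:

WITH Tree AS
(
    -- anchor: top-level rows are depth 0
    SELECT ID, ParentID, Title, 0 AS Depth
    FROM   Items
    WHERE  ParentID IS NULL
    UNION ALL
    -- each child is one level deeper than its parent
    SELECT i.ID, i.ParentID, i.Title, t.Depth + 1
    FROM   Items i
    JOIN   Tree t ON i.ParentID = t.ID
)
SELECT ID, Title, Depth
FROM   Tree
ORDER BY Depth, ID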


HELP Pls! Treeview Using SQL Server

Jul 15, 2004

Hi guys,
I need some help here regarding my project.

I would like to create a tree view diagram by importing my data from the database and displaying it in the tree view; however, I'm not sure how to implement it.

Help please,
regards


T-SQL (SS2K8) :: Query For Treeview

Sep 11, 2012

I have a query which is working fine. Is it possible that if Table3's column (Child) is related only to Table1, it shows under Table1 and not under Table2, while at the same time, when a Child has a parent in Table2 (which is usually the case), it shows under Table2 as it currently does?

In other words, the Child column is normally directly under Table2's row column name (Father), but occasionally it comes under Table1 with no relation to Table2.

How can I output that in a query for a treeview? I am assuming that I will have to program the outcome in C# as well, with 3 for loops, and in the second loop I can check whether the column is a grandchild or a child and make that the second row or 2nd node of the treeview, but I am having a problem building the query in SQL.

The query below shows all Parent, then Child, then Grandchild rows (all well and working), but what is desired is that at times a Child takes the place of a Father.

declare @x as xml
set @x =
(
SELECT distinct
    Table1.AssetSysID, Table1.Asset_ID, Table1.FromLR,
    Table1.Asset_ID + ', ' + Table1.[Desc2] as GrandFather,
    Table2.ACISysID, Table2.PAssetSysID,
    Table2.FeatureName + ', ' + Table2.[DESC] AS Father,
    Table3.ITMSysID, Table3.Item_ID + ',' + Table3.[DESC] as Child

[Code] .....


Design For Treeview-like Structure

Jun 7, 2007

OK, I have a design question, and since I am not a DB designer I hope somebody can give me some insight into this...

I have an app that uses a treeview control to display a hierarchy of a machine assembly. Currently it only goes two levels deep (top level and a single subcomponent).
What I would like to do is enable my users to add n-deep levels to the top-level machine. The problem is that I can't think of a way to store this in a DB, or what the table structure would look like.

It seems like this would be a classic problem in DB design, but that is where I lack knowledge, so any help will be greatly appreciated.
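It is indeed a classic problem; the textbook answer is a single self-referencing table (an "adjacency list"), which handles any depth. A minimal sketch with illustrative names:

CREATE TABLE Assembly
(
    AssemblyID int IDENTITY(1,1) PRIMARY KEY,
    ParentID   int NULL REFERENCES Assembly(AssemblyID),  -- NULL = top-level machine
    Name       nvarchar(100) NOT NULL
)

Each row points at its parent, so adding a deeper level is just inserting a row whose ParentID is an existing AssemblyID.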

Thanx


How Can I Rewrite This To Run Efficiently?

Apr 18, 2002

How can I rewrite this and trim the code?

CREATE Procedure Disbursements_Cats
(@startdate datetime,
@enddate datetime)

AS
Begin

SELECT Loan.loan_No AS Loan_No,
Loan.customer_No AS Customer_No,
Customer.first_name AS First_name,
Customer.second_name AS Second_name,
Customer.surname AS Surname,
Customer.initials AS Initials,
Bank.Bank_name AS Bank_name,
Branch.Branch_name AS Branch_name,
Branch.branch_code AS Branch_code ,
Bank_detail.bank_acc_type AS Bank_acc_type,
Transaction_Record.transaction_Amount AS Transaction_Amount,
Transaction_Record.transaction_Date AS Transaction_Date,
Loan.product AS Product,
Product.product_Type AS Product_Type,
Product_Type.loan_Type AS Loan_Type

FROM Transaction_Record INNER JOIN
Loan ON Transaction_Record.loan_No = Loan.loan_No INNER JOIN
Product ON Loan.product = Product.product INNER JOIN
Customer ON Loan.customer_No = Customer.customer_no INNER JOIN
Bank_detail ON Customer.customer_no = Bank_detail.customer_no INNER JOIN
Branch ON Bank_detail.Branch = Branch.Branch INNER JOIN
Bank ON Branch.Bank = Bank.Bank INNER JOIN
Product_Type ON Product.product_Type = Product_Type.product_Type

END;
GO
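For comparison, a sketch of the same procedure trimmed with short table aliases and without the redundant column aliases. Note the original never actually uses @startdate and @enddate, so a date filter (assumed here to be on transaction_Date) is presumably what was intended:

CREATE PROCEDURE Disbursements_Cats
    @startdate datetime,
    @enddate   datetime
AS
BEGIN
    SELECT  l.loan_No, l.customer_No,
            c.first_name, c.second_name, c.surname, c.initials,
            bk.Bank_name, br.Branch_name, br.branch_code,
            bd.bank_acc_type,
            tr.transaction_Amount, tr.transaction_Date,
            l.product, p.product_Type, pt.loan_Type
    FROM    Transaction_Record tr
            JOIN Loan l          ON tr.loan_No = l.loan_No
            JOIN Product p       ON l.product = p.product
            JOIN Customer c      ON l.customer_No = c.customer_no
            JOIN Bank_detail bd  ON c.customer_no = bd.customer_no
            JOIN Branch br       ON bd.Branch = br.Branch
            JOIN Bank bk         ON br.Bank = bk.Bank
            JOIN Product_Type pt ON p.product_Type = pt.product_Type
    WHERE   tr.transaction_Date BETWEEN @startdate AND @enddate  -- assumed intent
END
GO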


Trying To Build A Query For A TreeView / Navigation Menu

Mar 5, 2007

Hi Everyone,

I'm drawing a blank here and I am hoping someone can point me in the right direction. I have a table with the following columns (some omitted):

ID, GUID, PageName, ParentPageID

I want to build a hierarchical navigation system (2-tier). The conceptual problem that I am running into was getting this information from the same table. Initially I was going to use a nested repeater, but I am thinking a treeview would be better. Anyway, the problem is the query. How would I start with something like this? Let's use the following as an example.

Rows:
(ID '1', GUID '888....', PageName 'Page A', ParentPageID '-1')
(ID '2', GUID '111....', PageName 'Page B', ParentPageID '-1')
(ID '3', GUID '222....', PageName 'Page C', ParentPageID '-1')
(ID '4', GUID '375....', PageName 'Page 1', ParentPageID '1')
(ID '5', GUID '562....', PageName 'Page 2', ParentPageID '1')
(ID '6', GUID '874....', PageName 'Page 3', ParentPageID '2')
(ID '7', GUID '388....', PageName 'Page 4', ParentPageID '3')

So, I want to be able to build a query so that I can do the following:

Page A
    Page 1
    Page 2
Page B
    Page 3
Page C
    Page 4

Any help would be greatly appreciated. Thanks!
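A sketch of a two-tier query via a self-join, assuming the table is called Pages (each top-level page comes back once per child, with NULL children for leaves):

SELECT  parent.PageName AS ParentPage,
        child.PageName  AS ChildPage    -- NULL when a top-level page has no children
FROM    Pages parent
LEFT JOIN Pages child ON child.ParentPageID = parent.ID
WHERE   parent.ParentPageID = -1
ORDER BY parent.ID, child.ID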


Major Problem With @@IDENTITY, Treeview And GUID

Nov 14, 2004

Hi,

I'm trying to insert data into locally stored database (SQL Server).
The data I want inserted, is presented in a Treeview control and the data is fetched from a Webservice. The data is returned in form of a dataset.
The treeview contains checkboxes allowing a user to select what to install in the locally stored database.

To sum up:


1. Get data from a webservice - not my problem
2. Present data in a Treeview control - not my problem
3. Allow the user to select which data to install - not my problem
4. Insert data that the user has selected into my db - MY PROBLEM!!!!


The Treeview looks like this.

- Group1
| | ---- Rule1.1
| | ---- Rule1.2
|
- Group2
| | ---- Rule2.1
| | ---- Rule2.2
| | ---- Rule2.3

.....


The Treeview is generated with DataRelations between Group and Rule.

My locally stored database is designed by a third party provider and therefore the database must not be altered.
The table I want to store data in is called "Groups" and it looks like this:


GroupID        uniqueidentifier  ' (newid())
GroupName      nvarchar(50)
ParentGroupID  uniqueidentifier  ' if grouptype = 0 then ParentGroupID must have a value
GroupType      tinyint           ' 0 = subgroup, 1 = "top"group

Data in the table "Groups" would look like this:

GroupID      GroupName  ParentGroupID  GroupType
-----------  ---------  -------------  ---------
{000001...}  Group1     <NULL>         1
{000011...}  Rule1.1    {000001...}    0
{000012...}  Rule1.2    {000001...}    0
{000002...}  Group2     <NULL>         1
{000021...}  Rule2.1    {000002...}    0
{000022...}  Rule2.2    {000002...}    0
{000023...}  Rule2.3    {000002...}    0



The third party also created a stored procedure called pr_AddGroup taking the following parameters:

@GroupName ' can be both the RuleName and the GroupName
@GroupType ' can be 0 for subgroup or 1 for "top"group
@ParentGroup ' GUID


The problem with this stored procedure is that it does not have a return value, which is where my problem actually lies.
If it returned @@IDENTITY I could use that as the parameter for @ParentGroup.
Instead I figure I must create two SqlCommands (one calling pr_AddGroup and another calling SELECT @@IDENTITY to get the newly created record).

My SQL Commands look like this

Dim cmd As SqlCommand
Dim Conn As SqlConnection = New SqlConnection
Conn.ConnectionString = "Data Source=myServer;Initial Catalog=myTable;Integrated Security=SSPI"
cmd = New SqlCommand
cmd.CommandType = CommandType.StoredProcedure
cmd.Connection = Conn
cmd.CommandText = "pr_AddGroup"

cmd.Parameters.Add(New SqlParameter("@GroupName", SqlDbType.NVarChar, 50, ParameterDirection.Input))
cmd.Parameters.Add(New SqlParameter("@GroupType", SqlDbType.TinyInt, ParameterDirection.Input))
cmd.Parameters.Add(New SqlParameter("@ParentGroup", SqlDbType.UniqueIdentifier, ParameterDirection.Input))

Dim cmd2 As SqlCommand
cmd2 = New SqlCommand
cmd2.CommandType = CommandType.Text
cmd2.CommandText = "SELECT @@IDENTITY AS ID FROM Groups"
cmd2.Connection = Conn

Dim ParentGroupGUID As System.Guid


To get the data inserted in the Groups table I would do something like the following, but the code is very ugly
(and it doesn't work either):

For Each Group In TreeView1.Nodes ' Loop through Groups
    If Group.Checked Then
        cmd.Parameters("@GroupName").Value = Group.Text.ToString
        cmd.Parameters("@GroupType").Value = 1

        cmd.ExecuteNonQuery()
        ParentGroupGUID = cmd2.ExecuteScalar()
    End If

    For Each Rule In Group.Nodes ' Loop through Rules.
        If Rule.Checked Then
            cmd.Parameters("@GroupName").Value = Rule.Text.ToString ' was Group.Text: the rule's own caption
            cmd.Parameters("@GroupType").Value = 0 ' was 1: rules are subgroups per the schema above
            cmd.Parameters("@ParentGroup").Value = ParentGroupGUID
            cmd.ExecuteNonQuery()
        End If
    Next
Next
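The bigger catch is that @@IDENTITY only reports IDENTITY columns; GroupID is a uniqueidentifier filled by a newid() default, so "SELECT @@IDENTITY" will always come back NULL here. One workaround that leaves the third-party schema untouched is to read the new GUID back by the values just inserted; a hedged sketch of what cmd2 could run instead (it assumes group names are unique):

-- illustrative only: look the new row up by what was just passed to pr_AddGroup
DECLARE @GroupName nvarchar(50)
SET @GroupName = 'Group1'

SELECT GroupID
FROM   Groups
WHERE  GroupName = @GroupName
  AND  GroupType = 1   -- the "top" group just added

If names are not unique, the only clean fix is a helper procedure that generates the GUID itself and SELECTs it back, which would mean touching the schema.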


I've spent the last 5 hours figuring out this problem, so ANY help is appreciated :-)


Using Wildcards Efficiently With Equals Or LIKE

Jul 6, 2006

Is it possible to use wildcards with an equals statement? Such as:

SELECT * FROM Table WHERE City = '%' AND State = 'Ca'

Basically just stating where city equals anything... I know you can do it with a LIKE statement, such as:

SELECT * FROM Table WHERE City LIKE '%' AND State = 'Ca'

but is that very efficient? The reason I want to do this is because I want to programmatically set the city, so just omitting it won't work.

Also, using City LIKE '%' seems to not include NULL... is there any way to include NULL as well as anything else?
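One pattern that sidesteps the wildcard entirely, and also picks up NULL cities, is an optional-parameter test; a sketch with an illustrative table name:

DECLARE @City varchar(50)
SET @City = NULL   -- NULL = "any city", including rows where City itself is NULL

SELECT *
FROM   Locations          -- illustrative name
WHERE  (@City IS NULL OR City = @City)
  AND  State = 'Ca'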
Thanks for your help!


Rewrite A Query Efficiently

Mar 15, 2007

Is there a more efficient way to write this query?

SELECT CASE WHEN Population BETWEEN 0 AND 100 THEN '0-100'
            WHEN Population BETWEEN 101 AND 1000 THEN '101-1000'
            ELSE 'Greater than 1000'
       END AS Population_Range,
       COUNT(CASE WHEN Population BETWEEN 0 AND 100 THEN '0-100'
                  WHEN Population BETWEEN 101 AND 1000 THEN '101-1000'
                  ELSE 'Greater than 1000'
             END) AS [No. Of Countries]
FROM   Country
GROUP BY CASE WHEN Population BETWEEN 0 AND 100 THEN '0-100'
              WHEN Population BETWEEN 101 AND 1000 THEN '101-1000'
              ELSE 'Greater than 1000'
         END
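One common rewrite is to compute the CASE expression once in a derived table, so the buckets are defined in a single place:

SELECT Population_Range,
       COUNT(*) AS [No. Of Countries]
FROM  (SELECT CASE WHEN Population BETWEEN 0 AND 100 THEN '0-100'
                   WHEN Population BETWEEN 101 AND 1000 THEN '101-1000'
                   ELSE 'Greater than 1000'
              END AS Population_Range
       FROM Country) AS bucketed
GROUP BY Population_Range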


Ordering Records - Efficiently

Feb 8, 2008

I hope I explain this correctly...

I'm required to allow users to order items in a field to be displayed on a page in the order they specified.

For example: A user can drag and drop items in a list to specify the order it will be displayed in.

I have my drag and drop code ready to do this.
I have an idea on how to do this but I think it’s too inefficient.
I was going to create an orderby field and populate it with a number that corresponds to the position of the item. However, as one can deduce, if a user drags and drops a record between two others, I would have to change not only its orderby number but then change all the other items orderby number.

For instance, if I dropped an item with an orderby number of 3 between 6 and 7, I would have to change the 3 to a 7 and then change the orderby numbers of all the other records in between.

Well, I hope I make sense. It’s easier to visualize it on paper.

Does anyone know how to tackle this issue of user dynamic ordering?
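The best idea I have had so far is doing the whole shift in a single UPDATE instead of touching rows one by one; a sketch with illustrative names, moving the item at position @old to position @new:

DECLARE @old int, @new int
SELECT @old = 3, @new = 7

UPDATE Items   -- illustrative table with an OrderBy column
SET OrderBy = CASE
                  WHEN OrderBy = @old THEN @new                      -- the moved item
                  WHEN @old < @new AND OrderBy BETWEEN @old + 1 AND @new
                       THEN OrderBy - 1                              -- rows between shift up
                  WHEN @new < @old AND OrderBy BETWEEN @new AND @old - 1
                       THEN OrderBy + 1                              -- rows between shift down
                  ELSE OrderBy
              END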


Efficiently Joining Same Table Twice

Jul 23, 2005

My main table has the following structure:

t1 (id_primary, id_secundary, name), i.e. [(1, 1, "name1"), (2, 1, "name2")]

I want to join this table with the following second table:

t2 (id_primary, id_secundary, value), i.e. [(1, NULL, "value1"), (NULL, 1, "value2")]

The join should first try to find a match on id_primary, and only if that fails should it find a match on id_secundary. Every row in t1 is matched against a single row in t2.

The following query works:

select a.name, isnull(b.value, c.value)
from t1 a
left outer join t2 b on a.id_primary = b.id_primary
left outer join t2 c on a.id_secundary = c.id_secundary

I'm wondering though if it would be possible to write a query that only uses t2 once, since it actually is quite a complex query that is calculated twice now. Any ideas (besides using a temp table)?
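A sketch of one single-reference rewrite: join on either key, then keep the primary match when both hit. This assumes t2 is really a complex view or derived query you want evaluated once; the OR join itself is not guaranteed to be faster:

SELECT a.name,
       COALESCE(MAX(CASE WHEN a.id_primary   = b.id_primary   THEN b.value END),
                MAX(CASE WHEN a.id_secundary = b.id_secundary THEN b.value END)) AS value
FROM t1 a
LEFT JOIN t2 b
       ON a.id_primary = b.id_primary
       OR a.id_secundary = b.id_secundary
GROUP BY a.id_primary, a.id_secundary, a.name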


Efficiently Searching Multiple Words In A String

Feb 15, 2008

Hi,

I'd be interested in people's thoughts about the following. A user on my site will be searching for a venue name, and that could officially include a sponsor which the user might not search for. Now I am using the AutoCompleteDropdown from the AJAX Control Toolkit, so the user will start typing in a few characters and the results will be returned.

I can generate the results from SQL by doing a simple LIKE '%' + @searchTerm + '%'; however, this fills me with great fear of table scans. At the moment we'd be querying against a table of 5K records, but our application is very new.

I'm thinking one option is to split the words into another table - a one-to-many relationship to hold each word of the venue. The benefit of this would be that you could do a:

LIKE @term + '%'

but then I have the cost of the join (and the added complexity, which is not a major issue).

Any thoughts/tips?

Thanks!
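A sketch of that word-table design, with illustrative names; the point of the exercise is that a leading-prefix LIKE can use an index seek instead of a scan:

CREATE TABLE VenueWord
(
    VenueID int NOT NULL,          -- FK to the venue table (assumed to exist)
    Word    nvarchar(50) NOT NULL
)
CREATE INDEX IX_VenueWord_Word ON VenueWord (Word, VenueID)

-- autocomplete lookup: index seek on the word prefix
DECLARE @term nvarchar(50)
SET @term = 'sta'

SELECT DISTINCT v.VenueID, v.VenueName
FROM   Venue v                      -- illustrative venue table
JOIN   VenueWord w ON w.VenueID = v.VenueID
WHERE  w.Word LIKE @term + '%'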


SQL 2012 :: How To Efficiently Downsize Some Unicode Fields

Oct 12, 2015

We have a SQL Server 2012 Enterprise live transactional database that is now growing over 1G per month and is becoming a size problem for us. It is currently at 23G. Character type fields are all Unicode and I have calculated a savings of 5G in space converting only 2 such fields averaging 206 characters each to non-Unicode, and almost 10G in space if we convert a few more of them from nchar and nvarchar to char and varchar types. These fields will never have a requirement to hold Unicode characters that cannot be in the SQL_Latin1_General_CP1_CI_AS collation as they come in as plain ASCII originally and will always do so per the protocol standard.

I'm the software architect and chief C# developer, though only a DBA hack, or I would not have designed our database with Unicode fields for high-volume tables that did not need Unicode when the database was created 3 years ago. I want to correct this mistake now, before we finalize converting to an AlwaysOn environment, to help with various performance and backup issues.

After downsizing these two or more fields, we would like to shrink the database one time to take advantage of the space savings for full backups, and for seeding an AlwaysOn environment.

1. What is the safest and most efficient conversion technique for downsizing columns from nchar/nvarchar to char/varchar types, especially when there are multiple fields in the same table to be converted? (See the sketch after this list.) I tested doing an "add new column, set new=old, drop old, rename old to new" for the main two fields I want to convert from nvarchar(max) to varchar(max), and it took 81 minutes on our test server (4 virtual cores, 8G memory) before running out of disk space, even though there was 8G left on the disk and the db has unlimited size set (Could not allocate space for object 'dbo.abc'.'PK_xyz' in database 'xxx' because the 'PRIMARY' filegroup is full). I did delete an old database before it finished after getting a disk warning, so maybe it did not count that new space. Regardless, it was too slow. And this was on just the two largest of these columns (12.6M rows), and it only ran 2 to 3% CPU busy, so it seemed inefficient and indicated unacceptable downtime if we were to convert even these two fields, much less any additional ones. Average field size for these two fields was only 206 characters, or 412 bytes each. Another technique I plan to try is to create the new table definition in a new schema, select into it from the old table, then move the tables between schemas and delete the old one. I have a FK and indexes to contend with on the table.

2. If I figure out how to do #1 efficiently within an acceptable maintenance window, what is the safest practice for doing a one-time shrink and ending up with reorganized/rebuilt indexes and updated statistics? I understand the logic of not doing regular shrinks and that a shrink can sometimes actually increase the size.

3. Is there any third-party tool that could take a backup and restore it into a new database with the modified field definitions, or otherwise convert certain field types?
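On question 1, the in-place route can also be spelled with ALTER TABLE; a minimal sketch with illustrative names (this still rewrites every row of a (max) column, so it is just as log-heavy and needs testing against the same maintenance window):

ALTER TABLE dbo.SomeTable
    ALTER COLUMN BigTextField varchar(max) NULL   -- was nvarchar(max); names illustrative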


Need Opinions On Creating A Reporting Database More Efficiently

May 27, 2006

Situation:
SQL Server 2000.
At my new employer they have a production database on one server and a copy of it that is set to read only on another server which is used for reporting.

#1
They have an SQL Server Agent job on the production server that: (2 times a day)

Backs up the production database
Copies the backup file to a directory on the reporting server. (It's pretty big and can take time if there are problems with the LAN.)

#2
They have an SQL Server Agent job on the reporting server that: (scheduled to run 2 hours or so after the job on server 1 has run... they figured it would be a safe bet that the backup and copy process of the first job would be done by then)

Breaks the user connections to the reporting database
Performs a restore on the reporting database using the backup file that was copied to the holding directory by the production job.
Sets some permissions for various users.
Sets the reporting database to READ ONLY.
What I would like to do is find a more efficient way to create this reporting database. I have started doing research into DTS methods but would like some opinions from more experienced users.

Thank You,
Wade


Efficiently Creating Random Numbers In Very Large Table

Jan 19, 2007

Hello,

I need to sample data in a very large table in SQL Server 2000 (a gazillion rows of Performance Monitor statistics).

I'd like to take the top 5%, for instance, based upon a column containing random numbers.

Can anyone suggest a highly efficient method of populating a column with random numbers.
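A sketch of one cheap way to fill such a column, using CHECKSUM(NEWID()) for a per-row pseudo-random value (works on SQL Server 2000; table and column names are illustrative):

-- assign every row a pseudo-random integer
-- (for a truly huge table this would be done in batches)
UPDATE dbo.PerfStats
SET    RandomKey = ABS(CHECKSUM(NEWID()))

-- then take the sample
SELECT TOP 5 PERCENT *
FROM   dbo.PerfStats
ORDER BY RandomKey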

Thanks in advance.

Rod


Sampling Data Set Via Integration Services Data Flow For Data Mining Models Without Saving Training And Test Data Set?

Nov 24, 2006

Hi, all here,

Thank you very much for your kind attention.

I am wondering if it is possible to use SSIS to split a data set into training and test sets and feed them directly to my data mining models, without saving them somewhere, as they would occupy too much space. I really need guidance on that.

Thank you very much in advance for any help.

With best regards,

Yours sincerely,


System.Data.SqlClient.SqlException: The Conversion Of A Char Data Type To A Datetime Data Type Resulted In An Out-of-range Datetime Value.

Dec 14, 2005

After testing the application I wrote on my local PC, I deployed it to the web server to test it out, and I get this error:

System.Data.SqlClient.SqlException: The conversion of a char data type to a
datetime data type resulted in an out-of-range datetime value.

Notes: all pages that have this error have either a repeater or a datagrid that loads data during page load.

At first I thought the problem was with the date, but I can see that some other pages that have a datagrid (with a date field) work just fine.
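The usual culprit for this exact error is a date literal that parses differently under the web server's language/DATEFORMAT setting than it did locally; a quick T-SQL illustration:

SET DATEFORMAT dmy   -- e.g. a server configured for a European format

SELECT CAST('20051214' AS datetime)              -- unambiguous: always works
SELECT CAST('2005-12-14T00:00:00' AS datetime)   -- full ISO 8601: always works
SELECT CAST('12/14/2005' AS datetime)            -- read as day 12, month 14: out-of-range error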

Has anyone had this problem before? Hopefully you guys can help.

Thanks,


Data Reader Or Data Adapter With Data Set?

Dec 4, 2007

I have used both data readers and data adapters (with datasets) in the projects that I have worked on. I am trying to get some clarification on when I should be using which one. I think I am doing this correctly, but I want to be sure I am developing good habits.

As the name might suggest, it seems like a DataReader is only for reading data. I have read that the DataAdapter and DataSet are for a disconnected architecture, or at least that they can be used for that type of setup. I have been using the DataAdapter and DataSets when writing to a database, and the DataReader when reading from a database.

Is this how these should be used? Is the data reader the best choice for reading data? Am I doing this the optimal way from a performance stand point?

Thanks in advance


Master Data Services :: Master Data Services - Data Push Back To Excel Sheet

Nov 2, 2015

We already integrated different client data into MDS with the MS Excel plugin; now we want to push updated or newly added records back to the source database. Is it possible to do this using MDS? Is there any background sync process that automatically syncs data between subscribers and MDS?


Ntext Over 4000 Chars Causes 'Data In Row (n) Was Not Update... String Or Binary Data Would Be Truncated...'

Oct 18, 2006

When I enter over 4000 chars in any ntext field in my SQL Server 2005 database (directly in the database and through the application), I get an error saying that the data could not be updated because string or binary data would be truncated.

Has anyone ever seen this? I cannot figure out what is causing it; ntext should be able to hold a lot more data than this...


SQL Server Admin 2014 :: Change Data Capture(CDC) For Data Warehouse / Reporting?

Aug 12, 2015

I have a requirement to implement CDC for 50+ tables to capture incremental data changes for warehouse/reporting rather than exporting the whole table data. The largest table has more than half a billion records.

The warehouse uses a daily copy of the OLTP db (daily DB refresh). How can I accomplish this? Is there a downside in implementing CDC just for the sake of taking incremental changes on the tables?

Is there any performance impact if we enable CDC on OLTP db?

Can we make use of the CDC tables in the environment where we do the daily db refresh, so that the queries don't hit the OLTP database?

What is the best way to implement CDC to take incremental changes for reporting?
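For reference, enabling CDC itself is two calls (SQL Server 2008+, Enterprise or Developer edition at this version); the table name here is illustrative:

-- once per database
EXEC sys.sp_cdc_enable_db

-- once per tracked table; change rows then appear in cdc.dbo_Orders_CT
-- (the capture and cleanup jobs rely on SQL Server Agent running)
EXEC sys.sp_cdc_enable_table
     @source_schema = N'dbo',
     @source_name   = N'Orders',   -- illustrative table
     @role_name     = NULL         -- NULL = no gating role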


How To Convert To Regular Text, Data Stored In Image Data Type Field ????

Jul 20, 2005

Hi,

This is driving me nuts. I have a table that stores notes regarding an operation in an IMAGE data type field in MS SQL Server 2000.

I can read and write no problem using Access (using the StrConv function), and I can update the field correctly in T-SQL using:

DECLARE @ptrval varbinary(16)
SELECT @ptrval = TEXTPTR(BITS_data)
FROM mytable_BINARY WHERE ID = 'RB215'
WRITETEXT OPERATION_BINARY.BITS @ptrval 'My notes for this operation'

However, I just cannot seem to be able to convert the information back to text once it is stored, using T-SQL. My selects keep returning binary data.

How to do this? Thanks for your help.

SD
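A sketch of the usual way back to readable text on SQL Server 2000: cast the image column to varbinary, then to varchar (both capped at 8000 bytes on that version; names taken from the snippet above):

SELECT CAST(CAST(BITS_data AS varbinary(8000)) AS varchar(8000)) AS notes_text
FROM   mytable_BINARY
WHERE  ID = 'RB215'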


Integration Services :: SSIS VB Script Loading Data Into Oracle DB Missing Some Data

Nov 10, 2015

I'm using a Script Component to load data into an Oracle DB due to a poor performance issue. Now I have found that it is missing some data during the transmission. The SQL Server and Oracle result screenshots are not included here.
DDL:

create table Person
(
BusinessEntityID Integer,
FirstName nvarchar2(50),
MiddleName nvarchar2(50),
LastName nvarchar2(50)
);

(Result screenshot not included.)

I followed this article: [URL] ....

VB Script: 
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper

[Code] ..........


Pipeline Error-excel Source-data Reader Does Not Read In Meta Data

Apr 16, 2008

Hi all, I got this error:


[DTS.Pipeline] Error: "component "Excel Source" (1)" failed validation and returned validation status "VS_NEEDSNEWMETADATA".

and also this:

[Excel Source [1]] Warning: The external metadata column collection is out of synchronization with the data source columns. The column "Fiscal Week" needs to be updated in the external metadata column collection. The column "Fiscal Year" needs to be updated in the external metadata column collection. The column "1st level" needs to be added to the external metadata column collection. The column "2nd level" needs to be added to the external metadata column collection. The column "3rd level" needs to be added to the external metadata column collection. The "external metadata column "1st Level" (16745)" needs to be removed from the external metadata column collection. The "external metadata column "3rd Level" (16609)" needs to be removed from the external metadata column collection. The "external metadata column "2nd Level" (16272)" needs to be removed from the external metadata column collection.


I tried going to Data Flow -> Excel connection -> Advanced Editor for Excel Source -> Input and Output Properties, and tried to refresh the affected columns.
It seems that somehow the 3 columns are not read in from the source file?
And also, Fiscal Year and Fiscal Week are not set up properly in my data destination?
Has anyone faced such errors before?

Thanks


Data Access :: Arithmetic Overflow Error Converting Expression To Data Type Int

Jul 24, 2015

When I execute the stored procedure below, I get the error "Arithmetic overflow error converting expression to data type int".

USE [FileSharing]
GO
/****** Object: StoredProcedure [dbo].[xlaAFSsp_reports] Script Date: 24.07.2015 17:04:10 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO

[Code] .....

Msg 8115, Level 16, State 2, Procedure xlaAFSsp_reports, Line 25
Arithmetic overflow error converting expression to data type int.
The statement has been terminated.
(1 row(s) affected)
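For context, this message simply means some computed or converted value no longer fits in a 32-bit int (roughly plus or minus 2.1 billion); a minimal reproduction, with bigint as the usual fix:

SELECT CAST(3000000000 AS int)      -- fails: arithmetic overflow converting to int
SELECT CAST(3000000000 AS bigint)   -- works: bigint holds it comfortably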


I Am Accessing Data Using Data Access Pages In IIS 7 To SQL Server 2005 Authentication Is Failing

Feb 5, 2007

Is there a step-by-step paper to get there? Here is what I need to consider: I will have many customers that will need their own set of records and access pages "branded for their company", and each customer will have many clients. I am hosting this application on a Windows 2003 server with SQL Server 2005 Enterprise.

I am using Windows authentication. I created a username in Windows, then I added the Windows user in SQL Management Studio under Security, granted "DB Read" and "DB Write", and did so again under the database's Security tab. Still, authentication from the web fails. I must be missing a step or two?

I expect to set up a username for each database as I set up new customers.
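One step that is easy to miss: with Windows authentication, the web request usually arrives as the IIS worker-process/app-pool identity (NETWORK SERVICE by default on Windows 2003), not as the browsing user, so that account is the one needing a login and database user. A hedged sketch, with an illustrative database name:

CREATE LOGIN [NT AUTHORITY\NETWORK SERVICE] FROM WINDOWS
GO
USE CustomerDb
GO
CREATE USER [NT AUTHORITY\NETWORK SERVICE]
    FOR LOGIN [NT AUTHORITY\NETWORK SERVICE]
EXEC sp_addrolemember 'db_datareader', 'NT AUTHORITY\NETWORK SERVICE'
EXEC sp_addrolemember 'db_datawriter', 'NT AUTHORITY\NETWORK SERVICE'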


XML Data Source .. Expression? Variable? Connection? Error: Unable To Read The XML Data.

Feb 23, 2008


I want my XML data source to be an expression, as I will be looping through a directory of XML files.

I don't see the expression property or the connection property??

I tried setting the XMLData property to @[User::filename], but that results in:

Information: 0x40043006 at Load XML Files, DTS.Pipeline: Prepare for Execute phase is beginning.
Error: 0xC02090D0 at Load XML Files, XML Source [108]: The component "XML Source" (108) was unable to read the XML data.
Error: 0xC0047019 at Load XML Files, DTS.Pipeline: component "XML Source" (108) failed the prepare phase and returned error code 0xC02090D0.
Information: 0x4004300B at Load XML Files, DTS.Pipeline: "component "OLE DB Destination" (341)" wrote 0 rows.
Task failed: Load XML Files
Information: 0xC002F30E at Bad, File System Task: File or directory "d:jcpxmlLoadjcp2.xml.bad" was deleted.
Warning: 0x80019002 at Package: The Execution method succeeded, but the number of errors raised (2) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "Package.dtsx" finished: Failure.
The program '[3312] Package.dtsx: DTS' has exited with code 0 (0x0).


Thanks for any help or information.


Integration Services :: SSIS - Managing Data Integrity When Importing Sharepoint Data

Sep 28, 2015

I set up this package to import data from a SharePoint list to a SQL Server data table. The primary key of my SQL table is mapped to the Title column of my SharePoint list. There is a possibility that duplicate values will be entered in the Title field of the SharePoint list, so when importing data into my table via SSIS, my package always errors out when it comes across duplicate values. How have others managed data integrity when importing from a SharePoint list with the Title column mapped to the primary key of a table?


Data Conversion From String To Decimal When Saving Data To SQL Server 2005 Using An ADO Recordset

Feb 12, 2008

Hello,

I am wondering what conversion rules apply when a string, which contains a number, is saved to a SQL Server 2005 database into a column of type decimal.

This is the code I'm using (C++):

CString cValue = "0.75";
_variant_t vtFieldValue;
vtFieldValue = _variant_t(cValue);
pRecordSet->Fields->Item["MyColumn"]->Value = vtFieldValue;

"pRecordSet" is an ADO recordset. The database column "MyColumn" is of type "decimal(19,10)".

The most important question for me is whether the regional settings of the database server or the regional settings of the client PC are considered during the conversion from the string to the decimal value. For example, in standard French regional settings the "." would not be recognized as a decimal separator.

I am also wondering if the language of the database instance, in which this data is saved, is considered during this conversion or any other settings of this database instance.

So my general question is: Does anybody know exactly what rules apply during the above mentioned conversion?

Thank you for your help.

Regards,
Volker


Power Pivot :: Structural Data Model Changes In Data Source Leads To Errors

Oct 12, 2015

I have a question about how to handle structural data model changes in a data source of PowerPivot. Suppose I'm developing a star model in SQL Server, and sometimes a data type changes or the name of a field changes in a table. It seems to me that PowerPivot does not handle this as gracefully as Analysis Services (multidimensional) does (mostly). I received an error because of a wrong field name, and no error at all when a data type changed in PowerPivot. Is this common, or am I doing something wrong here? Does this mean that every time the data model changes the PowerPivot model should be recreated? Or am I missing the clue here?







