Showing posts with label nightly. Show all posts

Thursday, March 29, 2012

Automating Restoring of *.BAK files

I have set up a Maintenance Plan on a SQL Server 2000 SP3a instance on one server to
create flat-file backups of the full databases to *.BAK files nightly.
Is it possible to automate the restoring of such BAK files on another SQL
Server 2000 SP3a on another server (assume I have in place scripts for
copying the BAK files from the source server to the destination server)? If
so, how?
http://msdn.microsoft.com/library/en-us/adminsql/ad_automate_42r7.asp
--
David Portas
SQL Server MVP
--|||Yes, I know how to create a job in general, but what exactly do I run to
restore a BAK file?
"David Portas" <REMOVE_BEFORE_REPLYING_dportas@.acm.org> wrote in message
news:1106572451.019746.272450@.f14g2000cwb.googlegroups.com...
> http://msdn.microsoft.com/library/en-us/adminsql/ad_automate_42r7.asp
> --
> David Portas
> SQL Server MVP
> --
>|||Use the RESTORE DATABASE command in a Transact SQL job step. See Books
Online for details of the RESTORE DATABASE command.
--
David Portas
SQL Server MVP
--|||Taking a step back, I am just wondering whether a flat-file backup and restore
is the best way to synchronise two SQL Server 2000 databases, or whether I should
go for a DTS package that exports the database on the source server to an Access
mdb file and imports it on the other end. Sometimes I find that the users
in an exported flat file are not "usable" following an import on another server,
even if the referenced users are already defined on the destination
server.
"Patrick" <patl@.reply.newsgroup.msn.com> wrote in message
news:%23CWx88hAFHA.2552@.TK2MSFTNGP09.phx.gbl...
> Yes, I know how to create a job in general, but what exactly do I run to
> restore a BAK file?
>
> "David Portas" <REMOVE_BEFORE_REPLYING_dportas@.acm.org> wrote in message
> news:1106572451.019746.272450@.f14g2000cwb.googlegroups.com...
> > http://msdn.microsoft.com/library/en-us/adminsql/ad_automate_42r7.asp
> > --
> > David Portas
> > SQL Server MVP
> > --
> >
>|||"Patrick" <patl@.reply.newsgroup.msn.com> wrote in message
news:%23S0tVOhAFHA.3416@.TK2MSFTNGP09.phx.gbl...
> I have set up a Maintenance Plans on a SQL Server 2000 SP3a on one server to
> create flat file backup of full databases to *.BAK files nightly.
> Is it possible to automate the restoring of such BAK files on another SQL
> Server 2000 SP3a on another server (assume I have in place scripts for
> copying the BAK files from the source server to the destination server)? If
> so, how?
>
Yes.
In my case I wrote a stored proc on the restoring server and called it from
the backing up server.
CREATE procedure restore_FOO as
-- Look up the physical file name of the most recent backup of FOO recorded in the
-- backing-up server's msdb, reached over a linked server (called BAR here)
declare @backup_file as varchar(255)
select @backup_file = physical_device_name
from BAR.msdb.dbo.backupmediafamily
where media_set_id in (select max(media_set_id)
                       from BAR.msdb.dbo.backupset
                       where database_name = 'foo')
print @backup_file
-- Restore over the local copy, relocating the files to this server's drives
restore database FOO from disk = @backup_file with
    move 'SearchActivity_Data' to 'e:\sql_data\FOO_data.mdf',
    move 'SearchActivity_Log' to 'f:\SQL_LOGs\FOO_log.ldf',
    move 'SearchActivity_Index' to 'g:\sql_index\FOO_Index_Data.NDF',
    replace
GO
>
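
David's advice above - RESTORE DATABASE in a Transact-SQL job step - boils down to
something like the following minimal sketch. The database name, backup path, and
logical file names here are hypothetical placeholders; run RESTORE FILELISTONLY
against your own .BAK file to find the real logical names.
-- Run once, interactively, to list the logical file names inside the copied backup
RESTORE FILELISTONLY FROM DISK = 'D:\Incoming\MyDb.bak'
-- Job step: restore the copied backup over the destination copy of the database
RESTORE DATABASE MyDb
FROM DISK = 'D:\Incoming\MyDb.bak'
WITH MOVE 'MyDb_Data' TO 'D:\SQLData\MyDb.mdf',
     MOVE 'MyDb_Log' TO 'E:\SQLLogs\MyDb_log.ldf',
     REPLACE,      -- overwrite the existing destination database
     STATS = 10    -- report progress every 10 percent
Schedule that as a Transact-SQL job step in a SQL Server Agent job that runs after
the nightly file copy completes.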


Thursday, March 8, 2012

Automated db mirroring

I have two database servers. My primary server contains the live database.
I'd like to set up an automated script to take nightly snapshots of
the live database onto my secondary DB server.
How do I go about doing this?
What you are looking for is called Log Shipping. It is a built-in feature
of Enterprise Edition and can be 'bolted on' to other editions. There is a
simple example in the SQL Server Resource Kit you can adapt for your needs.
Geoff N. Hiten
Microsoft SQL Server MVP
Senior Database Administrator
Careerbuilder.com
"Ed" <eddiemarino@.hotmail.com> wrote in message
news:OOT502ILEHA.3012@.tk2msftngp13.phx.gbl...
> I have two database servers. My primary server contains the live database.
> I'd like to set up an automated script program to take nightly snapshots of
> the live database onto my secondary DB server.
> How do I go about doing this?
>
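
For anyone without Enterprise Edition, the 'bolted on' log shipping Geoff mentions
comes down to two scheduled steps. A minimal sketch with hypothetical server,
database, and path names; it assumes the standby copy was first seeded by restoring
a full backup WITH NORECOVERY or WITH STANDBY.
-- On the primary server, on a schedule (for example every 15 minutes):
BACKUP LOG LiveDb TO DISK = '\\Standby\LogShip\LiveDb_tlog.trn' WITH INIT
-- On the standby server, shortly afterwards:
RESTORE LOG LiveDb
FROM DISK = '\\Standby\LogShip\LiveDb_tlog.trn'
WITH STANDBY = 'D:\SQLData\LiveDb_undo.dat'   -- keeps the standby copy readable between restores
The Resource Kit example Geoff refers to is essentially this pattern with file-copy
and monitoring jobs around it.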


Wednesday, March 7, 2012

Automated Backup and restoring to a new machine

Hi all,
I am not sure if this is the correct place to ask this question. The problem
I am trying to solve is basically this. We do a nightly complete backup of
our database. The next day we copy the backup to a different machine,
restore it, and run jobs on this database. I would like to automate this
process, but don't know the best solution.
We have 3 machines here. Server A is the main production DB machine, server
B is our backup machine, and server C is where we restore the backed-up DB and
run the jobs. Can someone please suggest what in their opinion would be a
good solution for this. Thanks
Poorav
You might want to investigate log shipping for this functionality, which
requires Enterprise Edition. However, you can use scripts from the Resource
Kit or from some internet sites to achieve the same effect if you only have
Standard Edition. The principle is not difficult - you have a backup job on
the principal server, a share to copy the files to, and a job on the
standby server to restore the database and the subsequent log files (WITH
NORECOVERY).
Rgds,
Paul Ibison SQL Server MVP, www.replicationanswers.com
(recommended sql server 2000 replication book:
http://www.nwsu.com/0974973602p.html)
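
A minimal sketch of the restore job Paul describes for server C, with hypothetical
database, share, and file names. The full restore is left in NORECOVERY so copied
log backups can be applied afterwards; the last log restore recovers the database
so the jobs on server C can run against it.
-- Step 1: restore the copied full backup without recovering
RESTORE DATABASE ProdDb
FROM DISK = '\\ServerC\Staging\ProdDb_full.bak'
WITH MOVE 'ProdDb_Data' TO 'D:\SQLData\ProdDb.mdf',
     MOVE 'ProdDb_Log' TO 'E:\SQLLogs\ProdDb_log.ldf',
     NORECOVERY, REPLACE
-- Step 2: apply the copied log backup(s) in sequence, recovering on the last one
RESTORE LOG ProdDb FROM DISK = '\\ServerC\Staging\ProdDb_tlog.trn' WITH RECOVERY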

Saturday, February 25, 2012

Automate a restore through a backup

Hi all,
Regards SQL Server 2000.
I have a situation whereby the client takes a nightly backup of his
databases. The backups are placed on a network share. He has a laptop where
he has SQL Server installed and wants to update the laptop's databases from
the latest backup of the day before.
Is there a way to automate (within Enterprise Manager or T-SQL) the
restoration of a database through a backup file?
Thanks,
Ivan
> Is there a way to automate (within enterprice manager or t-sql) the
> restoration of a database through a backup file?
Sure, you can execute a T-SQL restore script. You may need to specify the
MOVE option if you need to restore to different drives/folders than the
source database:
RESTORE DATABASE MyDatabase
FROM DISK='\\BackupServer\BackupShare\MyDatabase.bak'
WITH
    MOVE 'MyDatabase' TO 'C:\DataFiles\MyDatabase.mdf',
    MOVE 'MyDatabase_Log' TO 'C:\LogFiles\MyDatabase_Log.ldf'
The restore runs in the OS security context of the SQL Server service
account when performed by a sysadmin role member so the service account will
need read permissions to the network backup share. Also, you can include
the restore scripts in a SQL Agent job to facilitate execution, either
scheduled or manual.
Hope this helps.
Dan Guzman
SQL Server MVP
"Ivan Debono" <ivanmdeb@.hotmail.com> wrote in message
news:eNh3EBedHHA.4188@.TK2MSFTNGP02.phx.gbl...
> Hi all,
> Regards SQL Server 2000.
> I have a situation whereby the client takes a nightly backup of his
> databases. The backups are placed on a network share. He has a laptop
> where he has sql server installed and wants to update the laptop's
> databases from the lastest backup of the day before.
> Is there a way to automate (within enterprice manager or t-sql) the
> restoration of a database through a backup file?
> Thanks,
> Ivan
>|||"Dan Guzman" <guzmanda@.nospam-online.sbcglobal.net> wrote in message
news:CBF1540E-BD97-436B-87DF-9FE24BD4EA6F@.microsoft.com...
> Sure, you can execute a T-SQL restore script. You may need to specify the
> MOVE option if you need to restore to different drives/folders than the
> source database:
> RESTORE DATABASE MyDatabase
> FROM DISK='\\BackupServer\BackupShare\MyDatabase.bak'
> WITH
> MOVE 'MyDatabase' TO 'C:\DataFiles\MyDatabase.mdf',
> MOVE 'MyDatabase_Log' TO 'C:\LogFiles\MyDatabase_Log.ldf'
> The restore runs in the OS security context of the SQL Server service
> account when performed by a sysadmin role member so the service account
> will need read permissions to the network backup share. Also, you can
> include the restore scripts in a SQL Agent job to facilitate execution,
> either scheduled or manual.
He may also want to look into straight log-shipping.
There's scripts for 'rolling your own'.
Rather than restore the full backup every night, he can restore just the
logs and put the DB into read-only mode if that'll work for him.
Even if he doesn't go that far, he can get "smarter" possibly and look in
the MSDB for the last backup set and automatically load that one.

> --
> Hope this helps.
> Dan Guzman
> SQL Server MVP
> "Ivan Debono" <ivanmdeb@.hotmail.com> wrote in message
> news:eNh3EBedHHA.4188@.TK2MSFTNGP02.phx.gbl...
>
Greg Moore
SQL Server DBA Consulting Remote and Onsite available!
Email: sql (at) greenms.com http://www.greenms.com/sqlserver.html|||"Dan Guzman" <guzmanda@.nospam-online.sbcglobal.net> schrieb im Newsbeitrag
news:CBF1540E-BD97-436B-87DF-9FE24BD4EA6F@.microsoft.com...
> Sure, you can execute a T-SQL restore script. You may need to specify the
> MOVE option if you need to restore to different drives/folders than the
> source database:
> RESTORE DATABASE MyDatabase
> FROM DISK='\\BackupServer\BackupShare\MyDatabase.bak'
> WITH
> MOVE 'MyDatabase' TO 'C:\DataFiles\MyDatabase.mdf',
> MOVE 'MyDatabase_Log' TO 'C:\LogFiles\MyDatabase_Log.ldf'
> The restore runs in the OS security context of the SQL Server service
> account when performed by a sysadmin role member so the service account
> will need read permissions to the network backup share. Also, you can
> include the restore scripts in a SQL Agent job to facilitate execution,
> either scheduled or manual.
> --
> Hope this helps.
> Dan Guzman
> SQL Server MVP
> "Ivan Debono" <ivanmdeb@.hotmail.com> wrote in message
> news:eNh3EBedHHA.4188@.TK2MSFTNGP02.phx.gbl...
>
That's a great idea. One problem is that the backup job creates the backup
files with a timestamp in the name. Is there a way to exclude this or at
least get the latest backup copy automatically?
Thanks,
Ivan|||On Apr 4, 12:14 pm, "Ivan Debono" <ivanm...@.hotmail.com> wrote:
> "Dan Guzman" <guzma...@.nospam-online.sbcglobal.net> schrieb im Newsbeitrag
news:CBF1540E-BD97-436B-87DF-9FE24BD4EA6F@.microsoft.com...
> That's a great idea. One problem is that the backup job creates the backup
> files with a timestamp in the name. Is there a way to exclude this or at
> least get the latest backup copy automatically?
> Thanks,
> Ivan
You can rename your latest backup file to a constant file name and
restore it from this.
If you don't want to change the name of the backup file, use
xp_cmdshell 'dir', put the output into a table, and parse it to get the name of
the latest backup file|||"M A Srinivas" <masri999@.gmail.com> schrieb im Newsbeitrag
news:1175684363.282267.133330@.e65g2000hsc.googlegroups.com...
> On Apr 4, 12:14 pm, "Ivan Debono" <ivanm...@.hotmail.com> wrote:
> You can rename your latest backup file to a constant file name and
> restore it from this.
> If you don't want to change the name of the backup file use
> xp_cmdshell 'dir ' and put into a table and parse to get the name of
> the latest backup file
>
And how does one place the output into the table?
Ivan|||> And how does one places the output into the table?
You can use INSERT...EXEC. For example:
CREATE TABLE #FileList(output_line varchar(8000))
INSERT INTO #FileList
EXEC master..xp_cmdshell 'DIR C:\Backups'
However, I think parsing the output is a bit of a kludge. You might instead
consider using a VBScript that you can include as an ActiveX script job step
or a stand-alone vbs file. Below is a VBScript example you can tweak for
your needs.
'Restore from latest backup
Option Explicit

Const BackupFolder = "C:\Backups"
Const ConnectionString = "Provider=SQLOLEDB;Data Source=MyServer;Initial Catalog=master;Integrated Security=SSPI"
Const RestoreScriptTemplate = "RESTORE DATABASE MyDatabase FROM DISK='$(BackupFilePath)'"

Dim oFSO, BackupFilePath
Set oFSO = CreateObject("Scripting.FileSystemObject")
BackupFilePath = GetLatestBackupFile()
RestoreDatabase ConnectionString, BackupFilePath

Function GetLatestBackupFile()
    Dim oFolder, oFile, LatestFilePath, LatestFileDateCreated
    Set oFolder = oFSO.GetFolder(BackupFolder)
    For Each oFile In oFolder.Files
        If oFile.DateCreated > LatestFileDateCreated Then
            LatestFileDateCreated = oFile.DateCreated
            LatestFilePath = oFile.Path
        End If
    Next
    GetLatestBackupFile = LatestFilePath
End Function

Sub RestoreDatabase(ConnectionString, BackupFilePath)
    Dim RestoreScript, connection
    RestoreScript = Replace(RestoreScriptTemplate, "$(BackupFilePath)", BackupFilePath)
    Set connection = CreateObject("ADODB.Connection")
    connection.Open ConnectionString
    connection.Execute RestoreScript
    connection.Close
End Sub
Hope this helps.
Dan Guzman
SQL Server MVP
"Ivan Debono" <ivanmdeb@.hotmail.com> wrote in message
news:%239dlo%23qdHHA.4872@.TK2MSFTNGP03.phx.gbl...
> "M A Srinivas" <masri999@.gmail.com> schrieb im Newsbeitrag
> news:1175684363.282267.133330@.e65g2000hsc.googlegroups.com...
> And how does one places the output into the table?
> Ivan
>
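
Greg's suggestion to "look in the MSDB for the last backup set" can also be done
purely in T-SQL, much like the stored procedure in the earlier thread. A sketch,
assuming a hypothetical linked server named SourceServer that points at the
backing-up instance and a hypothetical database name; it also assumes the path
recorded in msdb (for example a UNC backup path) is reachable from the restoring
machine.
DECLARE @bak varchar(255)
-- Pick the physical file of the most recent full backup recorded in the source msdb
SELECT @bak = mf.physical_device_name
FROM SourceServer.msdb.dbo.backupmediafamily mf
JOIN SourceServer.msdb.dbo.backupset bs ON bs.media_set_id = mf.media_set_id
WHERE bs.database_name = 'MyDatabase'
  AND bs.type = 'D'   -- full database backups only
  AND bs.backup_finish_date = (SELECT MAX(backup_finish_date)
                               FROM SourceServer.msdb.dbo.backupset
                               WHERE database_name = 'MyDatabase' AND type = 'D')
RESTORE DATABASE MyDatabase FROM DISK = @bak WITH REPLACE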


Automate a nightly query and email results...

Hi,
I was wondering if someone could help with my latest project.
I need to run a nightly query on our MS SQL 2000 DB, have the results saved
to a text file and then email the file to someone. I have no idea where to
begin with this one...help?
Thanks,
Mitch
You can create a Data Transformation Services package to create the file
based on the query and then email the file. Schedule the package to run every
night.
Data Transformation Services (DTS) in
Microsoft SQL Server 2000
http://msdn.microsoft.com/sql/sqlwarehouse/dts/default.aspx?pull=/library/en-us/dnsql2k/html/dts_overview.asp
Data Transformation Services
http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dtssql/dts_basic_5zg3.asp?frame=true
AMB
"mitch" wrote:

> Hi,
> I was wondering if someone could help with my latest project.
> I need to run a nightly query on our MS SQL 2000 DB, have the results saved
> to a text file and then email the file to someone. I have no idea were to
> begin with this one...help?
> Thanks,
> Mitch
>
>|||I would do this using a DTS package or using BCP.
For example, let's go with a DTS package:
In the package, you will add an SQL Server source, and flat file destination
connections. You add a data pump task that connects these two connections,
and pumps out your query results to the flat file.
Then the next step will send an email by specifying the file name as the
attachment.
For sending emails, I use xp_smtp_sendmail, which is a free download from
http://sqldev.net
You could also use SQL Mail. See SQL Server Books Online for more
information.
If you are not familiar with DTS, start with Books Online, and play around
with it from the DTS designer in Enterprise Manager. For DTS info, check out
sqldts.com.
--
HTH,
Vyas, MVP (SQL Server)
SQL Server Articles and Code Samples @. http://vyaskn.tripod.com/
"mitch" <mitch_001@.REMOVEyahoo.com> wrote in message
news:eUiUKWSnFHA.576@.TK2MSFTNGP15.phx.gbl...
> Hi,
> I was wondering if someone could help with my latest project.
> I need to run a nightly query on our MS SQL 2000 DB, have the results
> saved to a text file and then email the file to someone. I have no idea
> were to begin with this one...help?
> Thanks,
> Mitch
>|||Hi,
If you want to transfer all data to a text file then use the bcp utility,
e.g.:
exec master..xp_cmdshell 'bcp epinav.dbo.dlvmode out
D:\scriptrecordtable\dlvmode.bcp -n -S"EPIOLAPP428G" -U -P'
(EPIOLAPP428G is the server name; for more help read Books Online, just type BCP.)
If you want only selected data then create a view and transfer the data from the
view in the same way as above.
For sending mail I refer to Vyas' solution.
To copy data from a text file into your SQL table use this:
exec master..xp_cmdshell 'bcp epinav.dbo.custtable in
E:\RepScript\custtable.bcp -n -S"aicml370" -U -P'
Hope this helps
from
killer|||Thanks everyone!!!
I started playing around with DTS and was able to set up a query and save the
info to a text file, but now I have 2 more questions.
The information is saved in the text file row by row, BUT it is inserting a
space between each row; is there some way to remove the space?
Also, when a record is submitted it has to follow a YYYYMMDD format so I
placed a hidden text field in the form like this one:
<input name="source_date" type="hidden" id="source_date" value="<%
response.write YEAR(Date()) & _
Pd(Month(date()),2) & _
Pd(DAY(date()),2)
%>">
I am having some trouble in the query that runs in DTS to automatically
capture today's date and use it in the WHERE clause:
where [request_info_form].[source_date]='20050113'
Can you guys help me think of a way?
Thank you very much for all the help!
Mitch
"Narayana Vyas Kondreddi" <answer_me@.hotmail.com> wrote in message
news:%238YBLiSnFHA.2156@.TK2MSFTNGP14.phx.gbl...
>I would do this using a DTS package or using BCP.
> For example, let's go with a DTS package:
> In the package, you will add an SQL Server source, and flat file
> destination connections. You add a data pump task that connects these two
> connections, and pumps out your query results to the flat file.
> Then the next step will send an email by specifying the file name as the
> attachment.
> For sending emails, I use xp_smtp_sendmail, which is a free download from
> http://sqldev.net
> You could also use SQL Mail. See SQL Server Books Online for more
> information.
> if you are not familiar with DTS, start with Books Online, and play around
> with it from the DTS designer in Enterprise manager. For DTS info,
> checkout sqldts.com.
> --
> HTH,
> Vyas, MVP (SQL Server)
> SQL Server Articles and Code Samples @. http://vyaskn.tripod.com/
>
> "mitch" <mitch_001@.REMOVEyahoo.com> wrote in message
> news:eUiUKWSnFHA.576@.TK2MSFTNGP15.phx.gbl...
>|||Something like this should work...
select @yourdate = (select convert(CHAR(8),getdate(),112))
This puts things in yyyymmdd format.|||Works perfectly!!! Thanks!
<unc27932@.yahoo.com> wrote in message
news:1123689630.540990.188690@.z14g2000cwz.googlegroups.com...
> Something like this should work...
> select @.yourdate = (select convert(CHAR(8),getdate(),112))
> This puts things in yyyymmdd format.
>
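
For completeness, the whole nightly task can also be done in a single Transact-SQL
job step without DTS: bcp through xp_cmdshell for the extract, and SQL Mail's
xp_sendmail (which needs a MAPI profile on the server) for the e-mail, the route
Vyas mentions. A sketch with hypothetical database, table, path, server, and
recipient names:
DECLARE @today char(8), @file varchar(255), @cmd varchar(1000)
SET @today = CONVERT(char(8), GETDATE(), 112)   -- yyyymmdd, as used in the WHERE clause above
SET @file = 'D:\Reports\nightly_' + @today + '.txt'
SET @cmd = 'bcp "SELECT * FROM MyDb.dbo.request_info_form WHERE source_date = '''
           + @today + '''" queryout "' + @file + '" -c -T -SMyServer'
EXEC master..xp_cmdshell @cmd                   -- write the query results to the text file
EXEC master.dbo.xp_sendmail @recipients = 'someone@example.com',
     @subject = 'Nightly extract',
     @message = 'Results attached.',
     @attachments = @file
Scheduled as a nightly SQL Server Agent job, this covers the query, the text file,
and the e-mail in one place.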


Thursday, February 16, 2012

Auto shrink not working as expected

I'll be the first to admit, I am not a SQL expert. I have a nightly job
that backs up the tran logs on my DBs, and the box is checked to autoshrink
the tran log when it exceeds 100 MB. There are no errors and the shrink is
executed as shown in the log reports, but the size doesn't appear to change.
What is really stumping me is that if I do it manually - a tran log backup,
then shrink the DB - it results in only a few MB of shrinkage. If I then go
back and do the same thing again (this is consistent on four major DBs on
this server), a tran log backup followed by a shrink DB, it then shrinks the
tran log as expected. Several of these DBs aren't actively being updated at
the backup/shrink time, so records aren't being inserted (at least not that I
am aware of) at the time.
Is there something I am not doing, or does someone have an idea how to do this?
The backup and shrink are being handled via the setup in Enterprise Manager.
Paul Bergson MCT, MCSE, MCSA, Security+, CNE, CNA, CCA
http://www.pbbergs.com
Please no e-mails, any questions should be posted in the NewsGroup
This posting is provided "AS IS" with no warranties, and confers no rights. An obvious question is why do you think you need to shrink the files every
day? If they grow every day then all you're doing is slowing down your
database every day because it has to grow the file. Do you tear down your
garage every time you back your car out and build it again when you come
home? This is about the same logic as shrinking the database and log files
daily. You should only shrink the files when you know they aren't going to
grow again.
This posting is provided "AS IS" with no warranties, and confers no rights.
Use of included script samples are subject to the terms specified at
http://www.microsoft.com/info/cpyright.htm
"Paul Bergson" <pbergson@.allete_nospam.com> wrote in message
news:%23CHDBP%23vGHA.4512@.TK2MSFTNGP05.phx.gbl...
> I'll be the first to admit, I am not a SQL expert. I have a nightly job
> that backs up the tran logs on my db's and the box is checked to
> autoshrink the tran log when it exceeds 100 mb. There are no errors and
> the shrink is executed as shown in the log reports but the size doesn't
> appear to change. What is really stumping me is if I do it manually, I
> have to do a tran log backup, shrink the db. This results in a few mb
> shrinkage. If I then go back and do the same thing again (This is
> consistent on four major DB's on this server) tran log backup followed by
> a shrink db it then shrinks the tranlog as expected. Several of these
> db's aren't actively being updated at the backup/shrink time so records
> aren't being inserted (At least not that I am aware of) at the time.
> Is there something I am not doing or an idea someone may have to do this?
> The backup and shrink are being handled via the setup by Enterprise
> Manager.
> --
> Paul Bergson MCT, MCSE, MCSA, Security+, CNE, CNA, CCA
> http://www.pbbergs.com
> Please no e-mails, any questions should be posted in the NewsGroup
> This posting is provided "AS IS" with no warranties, and confers no
> rights.
>|||Roger is 100% correct but these may be of interest to you as well:
http://www.karaszi.com/SQLServer/info_dont_shrink.asp Shrinking
considerations
http://www.nigelrivett.net/TransactionLogFileGrows_1.html Log File issues
http://www.support.microsoft.com/?id=317375 Log File Grows too big
http://www.support.microsoft.com/?id=110139 Log file filling up
http://www.support.microsoft.com/?id=315512 Considerations for Autogrow
and AutoShrink
http://www.support.microsoft.com/?id=272318 Shrinking Log in SQL Server
2000 with DBCC SHRINKFILE
http://www.support.microsoft.com/?id=873235 How to stop the log file from
growing
http://www.support.microsoft.com/?id=305635 Timeout while DB expanding
http://www.support.microsoft.com/?id=307487 Shrinking TempDB
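(For reference, a minimal sketch of the approach the KB articles above describe,
particularly 272318: back up the log first, then shrink the specific log file
rather than the whole database. The database name, backup path and logical log
file name are placeholders; sp_helpfile shows the real logical name.)
USE MyDatabase
GO
BACKUP LOG MyDatabase TO DISK = 'D:\Backups\MyDatabase_log.bak'   -- free the inactive log space first
EXEC sp_helpfile                            -- lists the logical file names for this database
DBCC SHRINKFILE ('MyDatabase_Log', 100)     -- then shrink just the log file, target roughly 100 MB
GO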
Andrew J. Kelly SQL MVP
"Paul Bergson" <pbergson@.allete_nospam.com> wrote in message
news:%23CHDBP%23vGHA.4512@.TK2MSFTNGP05.phx.gbl...
> I'll be the first to admit, I am not a SQL expert. I have a nightly job
> that backs up the tran logs on my db's and the box is checked to
> autoshrink the tran log when it exceeds 100 mb. There are no errors and
> the shrink is executed as shown in the log reports but the size doesn't
> appear to change. What is really stumping me is if I do it manually, I
> have to do a tran log backup, shrink the db. This results in a few mb
> shrinkage. If I then go back and do the same thing again (This is
> consistent on four major DB's on this server) tran log backup followed by
> a shrink db it then shrinks the tranlog as expected. Several of these
> db's aren't actively being updated at the backup/shrink time so records
> aren't being inserted (At least not that I am aware of) at the time.
> Is there something I am not doing or an idea someone may have to do this?
> The backup and shrink are being handled via the setup by Enterprise
> Manager.
> --
> Paul Bergson MCT, MCSE, MCSA, Security+, CNE, CNA, CCA
> http://www.pbbergs.com
> Please no e-mails, any questions should be posted in the NewsGroup
> This posting is provided "AS IS" with no warranties, and confers no
> rights.
>|||The transaction log backup is nightly; the attempted shrinking is weekly. I
don't believe a tran log needs to be 10 GB in size, considering the data
file is only 2 GB. I would have thought the nightly tran log backup
would force the tran log to be re-used from the truncation point forward, but
that doesn't seem to be occurring. The goal is to keep the tran log at the
maximum size that could be used in a day, which is almost 2 GB.
Am I doing something wrong? Probably, but what, I don't know.
Paul Bergson MCT, MCSE, MCSA, Security+, CNE, CNA, CCA
http://www.pbbergs.com
Please no e-mails, any questions should be posted in the NewsGroup
This posting is provided "AS IS" with no warranties, and confers no rights.
"Roger Wolter[MSFT]" <rwolter@.online.microsoft.com> wrote in message
news:e%23oT4sCwGHA.1436@.TK2MSFTNGP02.phx.gbl...
> An obvious question is why do you think you need to shrink the files every
> day? If they grow every day then all you're doing is slowing down your
> database every day because it has to grow the file. Do you tear down your
> garage every time you back your car out and build it again when you come
> home? This is about the same logic as shrinking the database and log
> files daily. You should only shrink the files when you know they aren't
> going to grow again.
> --
> This posting is provided "AS IS" with no warranties, and confers no
> rights.
> Use of included script samples are subject to the terms specified at
> http://www.microsoft.com/info/cpyright.htm
> "Paul Bergson" <pbergson@.allete_nospam.com> wrote in message
> news:%23CHDBP%23vGHA.4512@.TK2MSFTNGP05.phx.gbl...
>|||Paul
You don't need to shrink the log, because it is meaningless: the log will just
grow again and again. A better option is to set up the log file at an
appropriate size so that it does not need to grow frequently.
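(A hedged sketch of what Uri suggests, pre-sizing the log once so it does not
have to autogrow during the day; the database name, logical log file name and
the 2 GB figure are placeholders.)
-- Grow the log to its expected working size a single time, up front.
ALTER DATABASE MyDatabase
MODIFY FILE (NAME = MyDatabase_Log, SIZE = 2048MB)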
"Paul Bergson" <pbergson@.allete_nospam.com> wrote in message
news:%23yRZMdGwGHA.1272@.TK2MSFTNGP05.phx.gbl...
> The transaction log backup is nightly, the attempted shrinking is weekly.
> I don't believe a tran log needs to be 10 gig in size, considering the
> data file is only 2 gig in size. I would have thought the nightly tran
> log back up would force the tran log to re-use from the truncation point
> forward but that doesn't seem to be occuring. The goal is to keep the
> tran log the max log that could be used in a day, which is almost 2 gig in
> size.
> Am I doing some thing wrong? Probably, but what I don't know.
> --
> Paul Bergson MCT, MCSE, MCSA, Security+, CNE, CNA, CCA
> http://www.pbbergs.com
> Please no e-mails, any questions should be posted in the NewsGroup
> This posting is provided "AS IS" with no warranties, and confers no
> rights.
> "Roger Wolter[MSFT]" <rwolter@.online.microsoft.com> wrote in message
> news:e%23oT4sCwGHA.1436@.TK2MSFTNGP02.phx.gbl...
>|||I don't think people understand my predicament.
I have four specific DBs on my SQL Server 2000 server. I run these in Full
Recovery mode with nightly tran log and weekly full backups. The log file
in some instances is more than 5 times the size of the db. I find it hard
to believe that this would be considered normal, since a nightly job would
never generate more data than the db itself holds.
If you can provide details as to why this is normal, please do.
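(One hedged way to see how much of a large log file is actually in use, rather
than merely reserved, is DBCC SQLPERF; it takes no database name and reports on
every database on the server.)
DBCC SQLPERF(LOGSPACE)
-- Returns the log size in MB and the percentage in use per database;
-- a 10 GB file that is mostly empty points at growth that was never
-- reclaimed rather than 10 GB of live log records.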
Paul Bergson MCT, MCSE, MCSA, Security+, CNE, CNA, CCA
http://www.pbbergs.com
Please no e-mails, any questions should be posted in the NewsGroup
This posting is provided "AS IS" with no warranties, and confers no rights.
"Uri Dimant" <urid@.iscar.co.il> wrote in message
news:%238Ke8jGwGHA.4444@.TK2MSFTNGP05.phx.gbl...
> Paul
> You don't need to shrink the log , because it is meaningless as it will
> be grown again and again. One option is to set up the log file with an
> appropriate size so that it does not need to grow frequently.
>
> "Paul Bergson" <pbergson@.allete_nospam.com> wrote in message
> news:%23yRZMdGwGHA.1272@.TK2MSFTNGP05.phx.gbl...
>|||Paul Bergson wrote:
> I don't think people understand my predicament.
> I have four specific db's on my sql server 2000 server. I run these in Full
> Recovery mode with nightly tran log and weekly full back ups. The log file
> in some instances is more than 5 times the size of the db. I find it hard
> to believe that this would be considered normal since a nightly job would
> never have more info than the db itself.
> If you can provide details as to why this is normal, please do.
>
Are you doing something like rebuilding indexes at night? Large imports?
Tracy McKibben
MCDBA
http://www.realsqlguy.com|||Imports can be large
Paul Bergson MCT, MCSE, MCSA, Security+, CNE, CNA, CCA
http://www.pbbergs.com
Please no e-mails, any questions should be posted in the NewsGroup
This posting is provided "AS IS" with no warranties, and confers no rights.
"Tracy McKibben" <tracy@.realsqlguy.com> wrote in message
news:uKZQv8GwGHA.4296@.TK2MSFTNGP06.phx.gbl...
> Paul Bergson wrote:
> Are you doing something like rebuilding indexes at night? Large imports?
>
> --
> Tracy McKibben
> MCDBA
> http://www.realsqlguy.com|||Paul Bergson wrote:
> Imports can be large
>
Ok, that could explain the large transaction log file. Say you're
importing 100,000 new rows of data, all as one transaction. The
transaction log has to be able to hold that entire transaction, in case
it has to roll back.
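(If a single large nightly import is what drives the growth, one hedged option
is to load it in smaller committed batches, for example with BULK INSERT's
BATCHSIZE option; the table name, file path and batch size are placeholders.)
BULK INSERT MyDatabase.dbo.import_table
FROM 'D:\Imports\nightly_feed.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', BATCHSIZE = 10000)
-- Each 10,000-row batch commits as its own transaction, so the active log only
-- has to cover one batch at a time (under full recovery the space still becomes
-- reusable only after the next log backup).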
Tracy McKibben
MCDBA
http://www.realsqlguy.com|||This may turn out to be a moot point. I'm on the phone with the vendor, and
they are somehow using the log files to store log history. It sounds like
it is unrelated to the actual log records that are needed for rollback. I don't
get it. I need to get more info if this is the case. It sounds to me like they
have a configuration option which could allow me to control how much history is
kept within this.
Maybe this is normal use; it seems odd to me, though.
Paul Bergson MCT, MCSE, MCSA, Security+, CNE, CNA, CCA
http://www.pbbergs.com
Please no e-mails, any questions should be posted in the NewsGroup
This posting is provided "AS IS" with no warranties, and confers no rights.
"Tracy McKibben" <tracy@.realsqlguy.com> wrote in message
news:O57rkXHwGHA.1224@.TK2MSFTNGP03.phx.gbl...
> Paul Bergson wrote:
> Ok, that could explain the large transaction log file. Say you're
> importing 100,000 new rows of data, all as one transaction. The
> transaction log has to be able to hold that entire transaction, in case it
> has to roll back.
>
> --
> Tracy McKibben
> MCDBA
> http://www.realsqlguy.com
