
Thursday, March 29, 2012

Automating DBCC Checkdb

SQL Server 2000
If I run a weekend job to perform DBCC CHECKDB on my databases, is there an alert I can monitor that will inform me of any problems discovered? Or is there a way to get the DBCC CHECKDB output into a text file? When running this command as a scheduled job, I don't see the output I am in the habit of perusing for errors.
Please advise, or tell me how you approach executing database maintenance.
Thanx!
|||Here's what I'd do:
Define two job steps "at the end" of the job: one with subject "success", the
other "error". In these, use xp_sendmail or xp_smtp_sendmail (I recommend the
latter: www.sqldev.net). In the "real" job steps, define "go to next step"
on success, and on failure go to the step that sends the email with subject
"failure". In the DBCC job step, define an output file, and in the mail job
steps, specify that file as an attachment.
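For what it's worth, a minimal T-SQL sketch of that job layout (the job name, file path, and e-mail address below are invented for illustration, not from the original post):

```sql
-- Sketch only: job, step, file and address names are hypothetical.
-- Step 1: run CHECKDB, write its output to a file, and branch to the
-- "success" or "failure" mail step depending on the outcome.
EXEC msdb.dbo.sp_add_jobstep
    @job_name = N'Weekend CHECKDB',
    @step_id = 1,
    @step_name = N'Run DBCC CHECKDB',
    @subsystem = N'TSQL',
    @command = N'DBCC CHECKDB (''MyDb'')',
    @output_file_name = N'D:\Dbmaint\checkdb.out',
    @on_success_action = 4, @on_success_step_id = 2,  -- go to step 2
    @on_fail_action = 4, @on_fail_step_id = 3;        -- go to step 3

-- Step 2: mail the output file with subject "success".
EXEC msdb.dbo.sp_add_jobstep
    @job_name = N'Weekend CHECKDB',
    @step_id = 2,
    @step_name = N'Mail success',
    @subsystem = N'TSQL',
    @command = N'EXEC master..xp_sendmail
        @recipients = ''dba@example.com'',
        @subject = ''CHECKDB success'',
        @attachments = ''D:\Dbmaint\checkdb.out''',
    @on_success_action = 1;  -- quit reporting success

-- Step 3 ("Mail failure") is the same apart from the subject
-- and quitting with failure (@on_success_action = 2).
```

The branch codes for sp_add_jobstep are 1 = quit with success, 2 = quit with failure, 3 = go to next step, 4 = go to the step given by @on_success_step_id / @on_fail_step_id.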
--
Tibor Karaszi, SQL Server MVP
Archive at:
http://groups.google.com/groups?oi=djq&as_ugroup=microsoft.public.sqlserver
|||
Use the sqlmaint.exe utility as described in BOL. It can send output to a file, in text and/or HTML format.
EXEC xp_sqlmaint '-S SQLSERVER -PlanName "DBMAINTPLAN 1" -CkDB -Rpt D:\Dbmaint\DBCC.out -DelTxtRpt 2months -HtmlRpt D:\Dbmaint\DBCC_result.htm'
Richard
|||
Thanx! Great advice, I like being pointed in the right direction instead of spinning my wheels all over BOL.
"Richard Ding" <dingr@.cleanharbors.com> wrote in message news:e8DcWfE5DHA.2432@.TK2MSFTNGP10.phx.gbl...
|||Thanx! Exactly the information I was looking for!
"Tibor Karaszi" <tibor_please.no.email_karaszi@.hotmail.nomail.com> wrote in
message news:OCrGg$C5DHA.2344@.TK2MSFTNGP09.phx.gbl...

Tuesday, March 27, 2012

Automatically script out a table & indexes - script needed (2005)

I'm trying to automate a daily export of a table using BCP. I'm using
native format for the BCP because the text in one of the fields can
contain pretty much any ASCII character, and using the other options
(including the null terminator, delimiters like |||, etc.) hasn't
worked particularly well.
So, I'm archiving out a table on a daily basis. I want to script out
the table at the same time; that way, if there are any table changes,
an import will still work.
How can I do this? I've been digging through Google for scripts with
no luck. Ideally I'd like a table-creation script, along with CREATE
INDEX statements.
Anybody have a script handy for this? I know it can be done by using
the system tables, but I'm hoping to avoid reinventing the wheel.
Thanks in advance.
Michael
M Bourgon (bourgon@.gmail.com) writes:
If you are on SQL 2005, you would use SMO for the scripting and on SQL 2000
it would be DMO. No, I don't have any examples, I have stayed away from
both.
Personally, I would prefer the definition of the table to be under version
control and be content with that.
But why use BCP as a backup tool? Why not simply BACKUP? Or are you
trying to tell us that this is the only table in a big database that
you want to back up?
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/prodtechnol/sql/2005/downloads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodinfo/previousversions/books.mspx
|||Kalen has written this script:

SELECT type_desc,
       OBJECT_NAME(ic.object_id) AS object_name,
       index_name = i.name,
       'column' = c.name,
       'column usage' = CASE ic.is_included_column
                            WHEN 0 THEN 'KEY'
                            ELSE 'INCLUDED'
                        END
FROM sys.index_columns ic
JOIN sys.columns c
  ON ic.object_id = c.object_id
 AND ic.column_id = c.column_id
JOIN sys.indexes i
  ON i.object_id = ic.object_id
 AND i.index_id = ic.index_id
"M Bourgon" <bourgon@.gmail.com> wrote in message
news:1184085398.736952.162110@.o61g2000hsh.googlegroups.com...
|||On Jul 10, 4:48 pm, Erland Sommarskog <esq...@.sommarskog.se> wrote:
> Personally, I would prefer the definition of the table to be under version
> control and be content with that.
We're working on implementing several changes; that's one of them.

> But why use BCP as a backup tool? Why not simply BACKUP? Or are you
> trying to tell us that this is the only table in a big database that
> you want to back up?
Nope. We're working on moving to Partitioned Tables, but right
now we have home-grown partitioning, and we need to deal with old
"partitions".
|||M Bourgon (bourgon@.gmail.com) writes:
> On Jul 10, 4:48 pm, Erland Sommarskog <esq...@.sommarskog.se> wrote:
> Nope. We're working on moving to Partitioned Tables, but right
> now we have home-grown partitioning, and we need to deal with old
> "partitions".
And the scripting is part of that? Maybe you could give more details?
If the main purpose is that the import of the BCP file in native format will
work, maybe it is sufficient to save the format file with the table? You
can create a format file from BCP with the format option. (You use "format"
in place of "in" or "out".)
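As an illustration (server, database, and path names here are made up), the format file can be generated from a job step via xp_cmdshell; bcp's "format" option writes only the format file, no data:

```sql
-- Generate a native-mode (-n) format file for a hypothetical table;
-- "nul" stands in for the data file, so only BigTable.fmt is written.
EXEC master..xp_cmdshell
    'bcp MyDb.dbo.BigTable format nul -n -f D:\Dbmaint\BigTable.fmt -S SQLSERVER -T';

-- Months later, import a saved native-format file using that format file:
EXEC master..xp_cmdshell
    'bcp MyDb.dbo.BigTable in D:\Archive\BigTable.dat -f D:\Dbmaint\BigTable.fmt -S SQLSERVER -T';
```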
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
|||On Jul 11, 4:03 pm, Erland Sommarskog <esq...@.sommarskog.se> wrote:
> M Bourgon (bour...@.gmail.com) writes:
>
> And the scripting is part of that? Maybe you could give more details?
Sure. We have a large table that constantly has new records added. We
want to be able to go back historically and pull from this data set.
The way we currently do it is to keep several days (the "hot" data)
available in one table (whilst auto-archiving one day's data out to a
table on a daily basis), several weeks available via a partitioned view,
and archive the older tables. This way, when we get requests for
older data, we can easily look in a set of tables for the results. We
split it up by day due to volume, and also because most of our
requests are "this subset of data, from date A to date B, further
filtered". We've discussed different ways of keeping the data
available, as what takes up the least amount of space is not
necessarily the easiest to query. I like the idea of keeping the data
in a database because then we can easily query it, and we don't have
to worry about any issues (i.e. if we save it out and the table
format changes, we don't run into issues months from now when we
try to load the data).
One question you'll probably ask - how do you create the one-day's-
worth-of-data table? Unfortunately, that's just a dumb script also,
destined to break if we wind up changing the format of the table.

> If the main purpose is that the import of the BCP file in native format will
> work, maybe it is sufficient to save the format file with the table? You
> can create a format file from BCP with the format option. (You use "format"
> in place of "in" or "out".)
Can the format file be used instead of DDL? I'll have to try it.
Thanks.
|||M Bourgon (bourgon@.gmail.com) writes:
> Sure. We have a large table that constantly has new records added. We
> want to be able to go back historically and pull from this data set.
> The way we currently do it is to keep several days (the "hot" data)
> available in one table (whilst auto-archiving 1 days' data out to a
> table on a daily basis) several weeks available by a partitioned view,
> and archive the older tables. This way, when we get requests for
> older data, we can easily look in a set of tables for the results. We
> split it up by day due to volume, and also because most of our
> requests are "this subset of data, from date A to date B, further
> filtered".
Maybe a very naïve and silly question, but what about a clustered index
on the big table? If you have a date range and a clustered index to match
that range, it's fairly irrelevant whether the table has 500 million rows.
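As a sketch (table and column names invented for illustration):

```sql
-- Cluster on the column the date-range requests filter by.
CREATE CLUSTERED INDEX IX_BigTable_LogDate
    ON dbo.BigTable (LogDate);

-- A range query then reads only the pages for that range,
-- however many rows the table holds in total:
SELECT *
FROM dbo.BigTable
WHERE LogDate >= '20120301' AND LogDate < '20120308';
```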
The major reasons to partition a table I know of are:
1) Being able to quickly drop old data or add new data, by shifting
a table out or in.
2) Spread the load over different file groups.
But it sounds like in your case you keep the data, so that reason to
partition is out.

> We've discussed different ways of keeping the data available, as what
> takes up the least amount of space is not necessarily the easiest to
> query. I like the idea of keeping the data in a database because then
> we can easily query it, and we don't have to worry about any issues
> (i.e. if we save it out and the table format changes, we don't run
> into issues months from now when we try to load the data).
But if you keep the data in the database, why then BCP?

> One question you'll probably ask - how do you create the one-days-
> worth-of-data table? Unfortunately, that's just a dumb script also,
> destined to break if we wind up changing the format of the table.
I don't see that much of a problem. I would not expect frequent schema
changes to a table of this size. Having to update one extra script
when you actually do is not that big a deal. Although for a plain copy,
you could use SELECT INTO. That would not give you the constraints,
triggers and indexes, though.
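For illustration (names invented), a plain daily copy via SELECT INTO would look like this; note that only the data and the base column definitions come across:

```sql
-- Creates dbo.BigTable_20120328 with the source table's columns,
-- but without constraints, triggers or indexes.
SELECT *
INTO dbo.BigTable_20120328
FROM dbo.BigTable
WHERE LogDate >= '20120328' AND LogDate < '20120329';
```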
What I am a little more curious about is what happens to all those daily
tables that all of a sudden have an obsolete definition.

> Can the format file be used instead of DDL? I'll have to try it.
No, the format file has nothing to do with DDL. But my thinking was that
if you saved the BCP file, and then want to import three months later,
the format file records the format of the file. Assuming that you only
add new columns at the end and don't drop any, it would import out of the box.
Else you would have to edit the column mapping in the format file.
If you want to script the table, then you are probably best off with DMO
on SQL 2000 and SMO on SQL 2005. I have not worked with either, so I can't
help.
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
|||On Jul 17, 4:37 pm, Erland Sommarskog <esq...@.sommarskog.se> wrote:
(sorry for delay on response)

> Maybe a very naïve and silly question, but what about a clustered index
> on the big table? If you have a date range and clustered index to match
> that range, it's fairly irrelevant if the table has 500 million rows.
Right. The problem isn't accessing the data in a reasonable amount of
time, it's space issues. We need to have X days available, but we
also need to be able to move it offline (and back online if we need to
query it) in order to free up space. Disk space is cheap, but it's
not that cheap.

> The major reasons to partition a table I know of are:
> 1) Being able to quickly drop old data or add new data, by shifting
> a table out or in.
> 2) Spread the load over different file groups.
3) Freeing up disk space.

> But if you keep the data in the database, why then BCP?
As the data ages, it needs to be pulled out.

> I don't see that much of a problem. I would not expect frequent schema
> changes to a table of this size. Having to update one script extra
> when you actually do is not that big deal. Although for a plain copy,
> you could use SELECT INTO. That would not give the constraints,
> triggers and indexes though.
Okay. I was hoping there was a way to do it.

> What I am a little more curious about is what happens to all those daily
> tables that all of a sudden have an obsolete definition.
The change in definition will be relatively minor. Instead of an INT
field (for ID, for instance), we need to move to BIGINT.

> If you want to script the table, then you are probably best off with DMO
> on SQL 2000 and SMO on SQL 2005. I have not worked with either, so I can't
> help.
Fair enough. I appreciate all the help, Erland. Thank you.
No, format file has nothing to do with DLL. But my thinking was that
if you saved the BCP file, and then want to import three months later,
the format files relates the format of the file. Assuming that you only
add new columns at the end not drop any, it would import out of the box.
Else you would have to edit column-mapping in the format file.
If you want to script the table, the you are probably best off with DMO
on SQL 2000 and SMO on SQL 2005. I have not worked with either, so I can't
help.
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/prodtechnol/sql/2005/downloads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodinfo/previousversions/books.mspx|||On Jul 17, 4:37 pm, Erland Sommarskog <esq...@.sommarskog.se> wrote:
(sorry for delay on response)
> Maybe a very na=EFve and silly question, but what about a clustered index
> on the big table? If you have a date range and clustered index to match
> that range, it's fairly irrelevant if the table has 500 million rows.
Right. The problem isn't accessing the data in a reasonable amount of
time, it's space issues. We need to have X days available, but we
also need to be able to move it offline (and back online if we need to
query it) in order to free up space. Disk space is cheap, but it's
not that cheap.
> The major reasons to partition a table I know of are:
> 1) Being able to quickly drop old data or add new data, by shifting
> a table out or in.
> 2) Spread the load over different file groups.
3) Make sure of disk space. :)
> But if you keep the data in the database, why then BCP?
As the data ages, it needs to be pulled out.
> I don't see that much of a problem. I would not expect frequent schema
> changes to a table of this size. Having to update one script extra
> when you actually do is not that big deal. Although for a plain copy,
> you could use SELECT INTO. That would not give the constraints,
> triggers and indexes though.
Okay. I was hoping there was a way to do it.
> What I am a little more curious is what happens to all those daily
> tables that all of a sudden has an obsolete definition.
The change in definition will be relatively minor. Instead of an INT
field (for ID, for instance), we need to move to BIGINT.
> If you want to script the table, the you are probably best off with DMO
> on SQL 2000 and SMO on SQL 2005. I have not worked with either, so I can't
> help.
Fair enough. I appreciate all the help, Erland. Thank you.

Automatically script out a table & indexes - script needed (2005)

I'm trying to automate an auto-export of a table on a daily basis,
using BCP. I'm using native format for the BCP because the text in
one of the fields can encompass pretty much any ASCII characters, and
using the other options (including the null terminator, stuff like
|||, etc) hasn't worked particularly well.
So, I'm archiving out a table on a daily basis. I want to script out
the table at the same time; that way, if there are any table changes,
an import will still work.
How can I do this? I've been digging through google for scripts with
no luck. Ideally I'd like a table-creation script, along with CREATE
INDEX statements.
Anybody have a script handy for this? I know it can be done by using
the system tables, but I'm hoping to avoid reinventing the wheel.
Thanks in advance.
MichaelM Bourgon (bourgon@.gmail.com) writes:
> I'm trying to automate an auto-export of a table on a daily basis,
> using BCP. I'm using native format for the BCP because the text in
> one of the fields can encompass pretty much any ASCII characters, and
> using the other options (including the null terminator, stuff like
>|||, etc) hasn't worked particularly well.
> So, I'm archiving out a table on a daily basis. I want to script out
> the table at the same time; that way, if there are any table changes,
> an import will still work.
> How can I do this? I've been digging through google for scripts with
> no luck. Ideally I'd like a table-creation script, along with CREATE
> INDEX statements.
> Anybody have a script handy for this? I know it can be done by using
> the system tables, but I'm hoping to avoid reinventing the wheel.
> Thanks in advance.
If you are on SQL 2005, you would use SMO for the scripting and on SQL 2000
it would be DMO. No, I don't have any examples, I have stayed away from
both.
Personally, I would prefer the definition of the table to be under version
control and be content with that.
But why use BCP as a backup tool? Why not simply BACKUP? Or are you
trying to tell us that this is the only table in a big database that
you want to back up?
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx|||Kalen has written this script
SELECT type_desc,
       OBJECT_NAME(ic.object_id) AS object_name,
       index_name = i.name,
       'column' = c.name,
       'column usage' = CASE ic.is_included_column
                             WHEN 0 THEN 'KEY'
                             ELSE 'INCLUDED'
                        END
FROM sys.index_columns ic
JOIN sys.columns c
  ON ic.object_id = c.object_id
 AND ic.column_id = c.column_id
JOIN sys.indexes i
  ON i.object_id = ic.object_id
 AND i.index_id = ic.index_id
"M Bourgon" <bourgon@.gmail.com> wrote in message
news:1184085398.736952.162110@.o61g2000hsh.googlegroups.com...
> I'm trying to automate an auto-export of a table on a daily basis,
> using BCP. I'm using native format for the BCP because the text in
> one of the fields can encompass pretty much any ASCII characters, and
> using the other options (including the null terminator, stuff like
> |||, etc) hasn't worked particularly well.
> So, I'm archiving out a table on a daily basis. I want to script out
> the table at the same time; that way, if there are any table changes,
> an import will still work.
> How can I do this? I've been digging through google for scripts with
> no luck. Ideally I'd like a table-creation script, along with CREATE
> INDEX statements.
> Anybody have a script handy for this? I know it can be done by using
> the system tables, but I'm hoping to avoid reinventing the wheel.
> Thanks in advance.
> Michael
>|||On Jul 10, 4:48 pm, Erland Sommarskog <esq...@.sommarskog.se> wrote:
> Personally, I would prefer the definition of the table to be under version
> control and be content with that.
We're working on implementing several changes, that's one of them.

> But why use BCP as a backup tool? Why not simply BACKUP? Or are you
> trying to tell us that this is the only table in a big database that
> you want to back up?
Nope. We're working on moving to Partitioned Tables, but right
now we have home-grown partitioning, and we need to deal with old
"partitions".|||M Bourgon (bourgon@.gmail.com) writes:
> On Jul 10, 4:48 pm, Erland Sommarskog <esq...@.sommarskog.se> wrote:
> Nope. We're working on moving to Partitioned Tables, but right
> now we have home-grown partitioning, and we need to deal with old
> "partitions".
And the scripting is part of that? Maybe you could give more details?
If the main purpose is that the import of the BCP in native format will
work, maybe it sufficient to save the format file with the table? You
can create a format file from BCP with the format option. (You use "format"
in place of "in" or "out".)
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx|||On Jul 11, 4:03 pm, Erland Sommarskog <esq...@.sommarskog.se> wrote:
> M Bourgon (bour...@.gmail.com) writes:
>
> And the scripting is part of that? Maybe you could give more details?
Sure. We have a large table that constantly has new records added. We
want to be able to go back historically and pull from this data set.
The way we currently do it is to keep several days (the "hot" data)
available in one table (whilst auto-archiving one day's data out to a
table on a daily basis), several weeks available by a partitioned view,
and archive the older tables. This way, when we get requests for
older data, we can easily look in a set of tables for the results. We
split it up by day due to volume, and also because most of our
requests are "this subset of data, from date A to date B, further
filtered". We've discussed different ways of keeping the data
available, as what takes up the least amount of space is not
necessarily the easiest to query. I like the idea of keeping the data
in a database because then we can easily query it, and we don't have
to worry about any issues (i.e. if we save it out, and the table
format changes, we don't run into issues months from now when we
try to load the data).
One question you'll probably ask - how do you create the one-days-
worth-of-data table? Unfortunately, that's just a dumb script also,
destined to break if we wind up changing the format of the table.

> If the main purpose is that the import of the BCP in native format will
> work, maybe it sufficient to save the format file with the table? You
> can create a format file from BCP with the format option. (You use "format"
> in place of "in" or "out".)
Can the format file be used instead of DDL? I'll have to try it.
Thanks.|||M Bourgon (bourgon@.gmail.com) writes:
> Sure. We have a large table that constantly has new records added. We
> want to be able to go back historically and pull from this data set.
> The way we currently do it is to keep several days (the "hot" data)
> available in one table (whilst auto-archiving 1 days' data out to a
> table on a daily basis) several weeks available by a partitioned view,
> and archive the older tables. This way, when we get requests for
> older data, we can easily look in a set of tables for the results. We
> split it up by day due to volume, and also because most of our
> requests are "this subset of data, from date A to date B, further
> filtered".
Maybe a very naïve and silly question, but what about a clustered index
on the big table? If you have a date range and clustered index to match
that range, it's fairly irrelevant if the table has 500 million rows.
The major reasons to partition a table I know of are:
1) Being able to quickly drop old data or add new data, by shifting
a table out or in.
2) Spread the load over different file groups.
But it sounds like in your case you keep the data, so that reason to
partition is out.

> We've discussed different ways of keeping the data available, as what
> takes up the least amount of space is not necessarily the easiest to
> query. I like the idea of keeping the data in a database because then
> we can easily query it, and we don't have to worry about any issues
> (i.e. if we save it out, and the table format changes, we're don't run
> into issues months from now when we try to load the data).
But if you keep the data in the database, why then BCP?

> One question you'll probably ask - how do you create the one-days-
> worth-of-data table? Unfortunately, that's just a dumb script also,
> destined to break if we wind up changing the format of the table.
I don't see that much of a problem. I would not expect frequent schema
changes to a table of this size. Having to update one extra script
when you actually do is not that big a deal. Although for a plain copy,
you could use SELECT INTO. That would not give the constraints,
triggers and indexes though.
What I am a little more curious about is what happens to all those daily
tables that all of a sudden have an obsolete definition.

> Can the format file be used instead of DDL? I'll have to try it.
No, the format file has nothing to do with DDL. But my thinking was that
if you saved the BCP file, and then want to import three months later,
the format file records the format of the file. Assuming that you only
add new columns at the end and don't drop any, it would import out of the box.
Else you would have to edit column-mapping in the format file.
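For what it's worth, a rough sketch of pairing each export with its format file, in the xp_cmdshell-wrapped bcp style used elsewhere in these threads (database, table, path, and server names here are hypothetical):

```sql
-- All names below are hypothetical; -T assumes a trusted connection.
-- Export one day's archive table in native format...
EXEC master..xp_cmdshell
    'bcp MyDb.dbo.DailyArchive_20070717 out D:\archive\DailyArchive_20070717.bcp -n -T -S"MYSERVER"'

-- ...then save the matching format file next to it, so a later
-- "bcp ... in" can re-map columns even if the table definition drifts.
EXEC master..xp_cmdshell
    'bcp MyDb.dbo.DailyArchive_20070717 format nul -n -f D:\archive\DailyArchive_20070717.fmt -T -S"MYSERVER"'
```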
If you want to script the table, then you are probably best off with DMO
on SQL 2000 and SMO on SQL 2005. I have not worked with either, so I can't
help.
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx|||On Jul 17, 4:37 pm, Erland Sommarskog <esq...@.sommarskog.se> wrote:
(sorry for delay on response)

> Maybe a very naïve and silly question, but what about a clustered index
> on the big table? If you have a date range and clustered index to match
> that range, it's fairly irrelevant if the table has 500 million rows.
Right. The problem isn't accessing the data in a reasonable amount of
time, it's space issues. We need to have X days available, but we
also need to be able to move it offline (and back online if we need to
query it) in order to free up space. Disk space is cheap, but it's
not that cheap.

> The major reasons to partition a table I know of are:
> 1) Being able to quickly drop old data or add new data, by shifting
> a table out or in.
> 2) Spread the load over different file groups.
3) Make sure of disk space.

> But if you keep the data in the database, why then BCP?
As the data ages, it needs to be pulled out.

> I don't see that much of a problem. I would not expect frequent schema
> changes to a table of this size. Having to update one extra script
> when you actually do is not that big a deal. Although for a plain copy,
> you could use SELECT INTO. That would not give the constraints,
> triggers and indexes though.
Okay. I was hoping there was a way to do it.

> What I am a little more curious about is what happens to all those daily
> tables that all of a sudden have an obsolete definition.
The change in definition will be relatively minor. Instead of an INT
field (for ID, for instance), we need to move to BIGINT.

> If you want to script the table, then you are probably best off with DMO
> on SQL 2000 and SMO on SQL 2005. I have not worked with either, so I can't
> help.
Fair enough. I appreciate all the help, Erland. Thank you.

Tuesday, March 20, 2012

Automatic text completion

I'm not entirely sure this is the place for it but I need to implement an automatic text completion function.

I'd like to know if there is something, such as a built-in function, that could help me.
The best idea I have is to create some sort of node tree and work with that but there has to be a better way.

I need to do it in C# or in MS SQL. Any kind of help is much appreciated!
You should direct your request to a platform group, e.g. Windows Applications with C# or Web Applications with C#; this mainly has nothing to do with SQL Server, more with the frontend you will create.

Jens K. Suessmeyer.

http://www.sqlserver2005.de
|||Actually I implemented it with SQL, and it works rather well, I must admit.
It returns:
"Emma Strandberg"
since there is a Firstname of Emma whose Lastname is Strandberg.

SQL Code

DECLARE @match varchar(15)

SET @match = 'Em'

/* SET NOCOUNT ON */

SELECT (Firstname + ' ' + Lastname) AS Wholename
FROM CMR_Arrivals
WHERE SUBSTRING(Lastname, 1, LEN(@match)) = @match
OR SUBSTRING(Firstname, 1, LEN(@match)) = @match

|||

And it will also return 'Bill Emerson'

However, you asked about

Automatic text completion

. Your 'solution' has absolutely nothing to do with auto-completion.

Yes, your search 'kinda' works to find the data you seek. One consideration is that by wrapping the column with a function, substring(), you guarantee that the process will NOT be able to efficiently use indexing. On a large table, that can be a bit slow.
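A sketch of what that advice looks like in practice, reusing the table and variable from the post above: a LIKE with a constant prefix is sargable, so an index on either name column can be seeked instead of scanned.

```sql
DECLARE @match varchar(15)
SET @match = 'Em'

-- Same search as before, but without wrapping the columns in
-- SUBSTRING(), so indexes on Lastname/Firstname stay usable.
SELECT Firstname + ' ' + Lastname AS Wholename
FROM CMR_Arrivals
WHERE Lastname LIKE @match + '%'
   OR Firstname LIKE @match + '%'
```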

You may wish to refer to Books Online about the use of Wildcards in search criteria.

See Books Online, Topics:

Wildcards LIKE 'Pattern Matching in Search Conditions'|||

The query will be used with AJAX AutoComplete (Extender Toolkit) and the results will be limited, hence the returned values won't be so many.

You are right in that it's not exactly what I was originally looking for!

Peter Ritchie (MVP) suggested:

" Sounds like what you want is AutoComplete, e.g. TextBox.AutoCompleteCustomSource.

Ken Getz has a good overview of the feature here: http://code-magazine.com/Article.aspx?quickid=0509111"

Where I replied:

" Sounds like something I was looking for BUT I was going to use the AutoCompleteExtender in the ASP.NET AJAX Toolkit.

Now I don't mind skipping the AJAX control but when looking at a Textbox the only option I have is AutoCompleteType.

None of the other functions such as AutoCompleteCustomSource are available.

I have .Net 2.0 installed and does say TextBox 2.0.0.0 .Net when holding the mouse over the control in VS2005.

Am I missing something here?
"

|||

Typically, the way 'autocomplete' functionality works is that a set of data has been retrieved from the database, and as the user types into the textbox control, the textchanged event is used to update the filter on the data.

Alternatively, on each keystroke, the textchanged event could be used to re-run the query to the data server.

BUT that is a very bad idea in terms of system performance.


Sunday, March 11, 2012

Automatic Email subject

Good Morning,

I have created a variable in a report which displays as a text box, i.e. it contains data like "LWD Apps 450" where the value will change each day. The report is emailed to a list of users. Is there a way to automatically enter this value so that it is in the subject bar when the email is automatically sent?

Thanks in advance.

Steve

Hi Steve,

The entry form for subscriptions only allows for two variables (@ReportName and @ExecutionTime).

The only way I can think to modify the subscription on the fly would be to each day programmatically create a new subscription with a subject. Here's an msdn article with sample code:
http://msdn2.microsoft.com/en-us/library/microsoft.wssux.reportingserviceswebservice.rsmanagementservice2005.reportingservice2005.createsubscription.aspx

In the code, you would perform whatever operation you needed to get the same information that the RDL is getting. Then instead of this line:
extensionParams(4).Value = "@ReportName was executed at @ExecutionTime"
you would use this line:
extensionParams(4).Value = "LWD Apps " & variableThatIJustRetrieved

You would set the report to run once and the time to run right then (or a minute in the future). You would also want to delete the subscription after it has run, so that you would have a clean slate for tomorrow's run.

It's definitely kludgy though. Does anyone else have any thoughts?
-Jessica


Thursday, March 8, 2012

Automated script generation

I often create scripts from SQL Server 2000 Enterprise Mgr the same way: I select all tables, check Indexes, check Constraints, check Windows text, then I go. Is it possible to automate this task further, for example running the scripting from a Stored Proc?RE: I often create scripts from SQL Server 2000 Enterprise Mgr the same way: I select all tables, check Indexes, check Constraints, check Windows text, then I go. Is it possible to automate this task further, for example running the scripting from a Stored Proc?

Q1 Is it possible to automate this task further, for example running the scripting from a Stored Proc?

A1 Yes, though for some tasks (particularly with previous versions) you may have to use cursors.|||I could, fairly quickly, write a SP that reads some system tables and does the scripting I need itself.

But, is there some way to make Enterprise Mgr to do it, without really having to write a program?|||Originally posted by Coolberg
I could, fairly quickly, write a SP that reads some system tables and does the scripting I need itself.

Well, not quickly I realize, because I need to put the ALTER TABLE ... DROP CONSTRAINT statements in the proper order.
Or could I do an ALTER TABLE ... NOCHECK ALL on all tables...?
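A rough sketch of the NOCHECK idea, using the undocumented sp_MSforeachtable helper that ships with SQL 2000. One caveat: NOCHECK only disables checking; the foreign keys still exist, so a referenced table still cannot be dropped until the constraints themselves are dropped.

```sql
-- Disable checking on every FK and CHECK constraint in the database.
EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL'

-- ... run the generated maintenance scripts here ...

-- Re-enable checking afterwards (existing rows are not re-validated;
-- that would require re-adding the constraints WITH CHECK).
EXEC sp_MSforeachtable 'ALTER TABLE ? CHECK CONSTRAINT ALL'
```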

Saturday, February 25, 2012

Automate a nightly query and email results...

Hi,
I was wondering if someone could help with my latest project.
I need to run a nightly query on our MS SQL 2000 DB, have the results saved
to a text file and then email the file to someone. I have no idea where to
begin with this one...help?
Thanks,
MitchYou can create a Data Transformation Services package to create the file
based on the query and then email the file. Schedule the package to run every
night.
Data Transformation Services (DTS) in
Microsoft SQL Server 2000
http://msdn.microsoft.com/sql/sqlwa...ts_overview.asp
Data Transformation Services
http://msdn.microsoft.com/library/d...asp?frame=true
AMB
"mitch" wrote:

> Hi,
> I was wondering if someone could help with my latest project.
> I need to run a nightly query on our MS SQL 2000 DB, have the results saved
> to a text file and then email the file to someone. I have no idea where to
> begin with this one...help?
> Thanks,
> Mitch
>
>|||I would do this using a DTS package or using BCP.
For example, let's go with a DTS package:
In the package, you will add an SQL Server source, and flat file destination
connections. You add a data pump task that connects these two connections,
and pumps out your query results to the flat file.
Then the next step will send an email by specifying the file name as the
attachment.
For sending emails, I use xp_smtp_sendmail, which is a free download from
http://sqldev.net
You could also use SQL Mail. See SQL Server Books Online for more
information.
if you are not familiar with DTS, start with Books Online, and play around
with it from the DTS designer in Enterprise manager. For DTS info, checkout
sqldts.com.
--
HTH,
Vyas, MVP (SQL Server)
SQL Server Articles and Code Samples @. http://vyaskn.tripod.com/
"mitch" <mitch_001@.REMOVEyahoo.com> wrote in message
news:eUiUKWSnFHA.576@.TK2MSFTNGP15.phx.gbl...
> Hi,
> I was wondering if someone could help with my latest project.
> I need to run a nightly query on our MS SQL 2000 DB, have the results
> saved to a text file and then email the file to someone. I have no idea
> where to begin with this one...help?
> Thanks,
> Mitch
>|||Hi,
If you want to transfer all data to a text file, then use the bcp utility,
e.g.:
exec master..xp_cmdshell 'bcp epinav.dbo.dlvmode out
D:\scriptrecordtable\dlvmode.bcp -n -S"EPIOLAPP428G" -U -P'
(epiolapp428g is the server name; for more help, read Books Online and just
type BCP). If you want selected data, then create a view and transfer the
data from the view the same way as above.
For sending mail, I refer to Vyas's solution.
To copy data from a text file into your SQL table, use this:
exec master..xp_cmdshell 'bcp epinav.dbo.custtable in
E:\RepScript\custtable.bcp -n -S"aicml370" -U -P'
hope this helps
from
killer|||Thanks everyone!!!
I started playing around with DTS and was able to setup a query and save the
info to a text file, but now I have 2 more question.
The information is saved in the text file row by row, BUT it is inserting a
space between each row, is there some way to remove the space?
Also, when a record is submitted it has to follow a YYYYMMDD format so I
placed a hidden text field in the form like this one:
<input name="source_date" type="hidden" id="source_date" value="<%
response.write YEAR(Date()) & _
Pd(Month(date()),2) & _
Pd(DAY(date()),2)
%>">
I am having some trouble in the query that runs in DTS to automatically
capture today's date and use it in the WHERE clause:
where [request_info_form].[source_date]='20050113'
Can you guys help me think of a way?
Thank you very much for all the help!
Mitch
"Narayana Vyas Kondreddi" <answer_me@.hotmail.com> wrote in message
news:%238YBLiSnFHA.2156@.TK2MSFTNGP14.phx.gbl...
>I would do this using a DTS package or using BCP.
> For example, let's go with a DTS package:
> In the package, you will add an SQL Server source, and flat file
> destination connections. You add a data pump task that connects these two
> connections, and pumps out your query results to the flat file.
> Then the next step will send an email by specifying the file name as the
> attachment.
> For sending emails, I use xp_smtp_sendmail, which is a free download from
> http://sqldev.net
> You could also use SQL Mail. See SQL Server Books Online for more
> information.
> if you are not familiar with DTS, start with Books Online, and play around
> with it from the DTS designer in Enterprise manager. For DTS info,
> checkout sqldts.com.
> --
> HTH,
> Vyas, MVP (SQL Server)
> SQL Server Articles and Code Samples @. http://vyaskn.tripod.com/
>
> "mitch" <mitch_001@.REMOVEyahoo.com> wrote in message
> news:eUiUKWSnFHA.576@.TK2MSFTNGP15.phx.gbl...
>|||Something like this should work...
select @yourdate = (select convert(CHAR(8),getdate(),112))
This puts things in yyyymmdd format.|||Works perfectly!!! Thanks!
<unc27932@.yahoo.com> wrote in message
news:1123689630.540990.188690@.z14g2000cwz.googlegroups.com...
> Something like this should work...
> select @yourdate = (select convert(CHAR(8),getdate(),112))
> This puts things in yyyymmdd format.
>
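Putting the two replies together: the converted date can feed the WHERE clause directly, so the DTS query no longer hard-codes a literal like '20050113' (table and column names taken from the thread; @yourdate is illustrative).

```sql
-- Build today's date in yyyymmdd form (style 112) and filter on it.
DECLARE @yourdate char(8)
SELECT @yourdate = CONVERT(char(8), GETDATE(), 112)

SELECT *
FROM request_info_form
WHERE source_date = @yourdate
```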

Automate a nightly query and email results...

Hi,
I was wondering if someone could help with my latest project.
I need to run a nightly query on our MS SQL 2000 DB, have the results saved
to a text file and then email the file to someone. I have no idea were to
begin with this one...help?
Thanks,
Mitch
You can create a Data Transformation Services package to create the file
based on the query and then email the file. Schedule the package to run every
night.
Data Transformation Services (DTS) in
Microsoft SQL Server 2000
http://msdn.microsoft.com/sql/sqlwar...s_overview.asp
Data Transformation Services
http://msdn.microsoft.com/library/de...asp?frame=true
AMB
"mitch" wrote:

> Hi,
> I was wondering if someone could help with my latest project.
> I need to run a nightly query on our MS SQL 2000 DB, have the results saved
> to a text file and then email the file to someone. I have no idea were to
> begin with this one...help?
> Thanks,
> Mitch
>
>
|||I would do this using a DTS package or using BCP.
For example, let's go with a DTS package:
In the package, you will add an SQL Server source, and flat file destination
connections. You add a data pump task that connects these two connections,
and pumps out your query results to the flat file.
Then the next step will send an email by specifying the file name as the
attachment.
For sending emails, I use xp_smtp_sendmail, which is a free download from
http://sqldev.net
You could also use SQL Mail. See SQL Server Books Online for more
information.
If you are not familiar with DTS, start with Books Online and play around
with it from the DTS designer in Enterprise Manager. For DTS info, check out
sqldts.com.
HTH,
Vyas, MVP (SQL Server)
SQL Server Articles and Code Samples @ http://vyaskn.tripod.com/
"mitch" <mitch_001@.REMOVEyahoo.com> wrote in message
news:eUiUKWSnFHA.576@.TK2MSFTNGP15.phx.gbl...
> Hi,
> I was wondering if someone could help with my latest project.
> I need to run a nightly query on our MS SQL 2000 DB, have the results
> saved to a text file and then email the file to someone. I have no idea
> were to begin with this one...help?
> Thanks,
> Mitch
>
|||Hi,
If you want to transfer all data to a text file, use the bcp utility, e.g.:
exec master..xp_cmdshell 'bcp epinav.dbo.dlvmode out
D:\scriptrecordtable\dlvmode.bcp -n -S"EPIOLAPP428G" -U -P'
(EPIOLAPP428G is the server name, passed with the -S switch.)
For more help, read Books Online; just look up BCP.
If you want only selected data, create a view and transfer the data from the
view in the same way.
For sending mail, I refer you to Vyas's solution.
To copy data from a text file into your SQL table, use this:
exec master..xp_cmdshell 'bcp epinav.dbo.custtable in
E:\RepScript\custtable.bcp -n -S"aicml370" -U -P'
Hope this helps,
killer
|||Thanks everyone!!!
I started playing around with DTS and was able to setup a query and save the
info to a text file, but now I have 2 more question.
The information is saved in the text file row by row, BUT it is inserting a
space between each row, is there some way to remove the space?
Also, when a record is submitted, its date has to follow a YYYYMMDD format, so
I placed a hidden text field in the form like this one:
<input name="source_date" type="hidden" id="source_date" value="<%
response.write YEAR(Date()) & _
Pd(Month(date()),2) & _
Pd(DAY(date()),2)
%>">
I am having some trouble in the query that runs in DTS to automatically
capture today's date and use it in the WHERE clause:
where [request_info_form].[source_date]='20050113'
Can you guys help me think of a way?
Thank you very much for all the help!
Mitch
"Narayana Vyas Kondreddi" <answer_me@.hotmail.com> wrote in message
news:%238YBLiSnFHA.2156@.TK2MSFTNGP14.phx.gbl...
>I would do this using a DTS package or using BCP.
> For example, let's go with a DTS package:
> In the package, you will add an SQL Server source, and flat file
> destination connections. You add a data pump task that connects these two
> connections, and pumps out your query results to the flat file.
> Then the next step will send an email by specifying the file name as the
> attachment.
> For sending emails, I use xp_smtp_sendmail, which is a free download from
> http://sqldev.net
> You could also use SQL Mail. See SQL Server Books Online for more
> information.
> if you are not familiar with DTS, start with Books Online, and play around
> with it from the DTS designer in Enterprise manager. For DTS info,
> checkout sqldts.com.
> --
> HTH,
> Vyas, MVP (SQL Server)
> SQL Server Articles and Code Samples @. http://vyaskn.tripod.com/
>
> "mitch" <mitch_001@.REMOVEyahoo.com> wrote in message
> news:eUiUKWSnFHA.576@.TK2MSFTNGP15.phx.gbl...
>
|||Something like this should work...
select @yourdate = (select convert(CHAR(8),getdate(),112))
This puts things in yyyymmdd format.
|||Works perfectly!!! Thanks!
<unc27932@.yahoo.com> wrote in message
news:1123689630.540990.188690@.z14g2000cwz.googlegroups.com...
> Something like this should work...
> select @yourdate = (select convert(CHAR(8),getdate(),112))
> This puts things in yyyymmdd format.
>
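Putting the pieces together, the scheduled DTS query can compute today's date inline instead of hard-coding it. A sketch (the table and column names are the ones from Mitch's example; adjust to taste):

```sql
-- Build today's date as yyyymmdd (style 112) and use it in the WHERE clause
declare @yourdate char(8)
select @yourdate = convert(char(8), getdate(), 112)  -- e.g. '20050113'

select *
from request_info_form
where source_date = @yourdate
```

With the date computed at run time, the same package can be scheduled nightly without editing the query.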

Auto-Increment field?

Hi!

I'm using Microsoft SQL Server Management Studio to design a table with two fields:

id (int)

file (text)

I set 'id' as the primary key. When I try to add a row to this table, it asks me for a custom value for 'id'. I want it simply to auto-assign a unique value. How do I do this, please?

In the table designer, set the Identity Specification to Is_Identity = Yes.

Also, I recommend NOT using [ID] as the column name. A good standard is to use the TableName and ID, so a table named MyTable would have its IDENTITY column named MyTableID.
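A minimal sketch of the suggested table definition, with the IDENTITY property supplying the auto-assigned value (names follow the MyTable/MyTableID convention described above):

```sql
CREATE TABLE MyTable (
    MyTableID INT IDENTITY(1,1) NOT NULL PRIMARY KEY,  -- auto-assigned, starts at 1
    [file] TEXT NULL                                   -- bracketed to be safe as a name
)
```

Inserts then omit the ID column entirely, e.g. INSERT INTO MyTable ([file]) VALUES ('some text'), and SQL Server fills in MyTableID.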

Thursday, February 16, 2012

Auto scripting SPs, perhaps concatenating text fields...

Hi,

I'm trying to get the text of all my SPs saved into text (*.sql) files. My first thought was to use sp_helptext and bcp the table to a text file, but this isn't working (see my other post), so I thought I'd try another method.

I can get the code from syscomments.text and concatenate the varchar(8000) chunks together into a text field. Unfortunately this isn't as easy as just text = text + newtext; how is this done?

Or am I doing it all completely the wrong way? BTW, I have over 150 SPs, so I can't save them individually.

Thanks!

Nick

Why don't you use EM or SSMS for that? Both utilities have a feature to script database objects.|||

I've just found the "create one file per object" option. Thanks gavrilenko_s!

Nick
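For reference, the syscomments route Nick started down can also be done in T-SQL: on SQL Server 2000 the text of a procedure is stored in varchar(8000) chunks, one row per chunk, ordered by colid. A sketch for one procedure (the procedure name is a placeholder):

```sql
-- Fetch the source chunks of one stored procedure, in order
SELECT c.text
FROM syscomments c
JOIN sysobjects o ON o.id = c.id
WHERE o.type = 'P'
  AND o.name = 'MyProc'   -- hypothetical procedure name
ORDER BY c.colid
```

The rows just need to be written out back-to-back; that said, SSMS's "create one file per object" scripting option, as found above, avoids the manual concatenation entirely.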

auto save query result to a text file

Is there an easy way to automate the following steps
from SQL server:
1. run a sql-query script (like "select * from ..." )
2. save the query result to a .txt or .rpt file in a
designated folder.
- I know I can write some C++ application to do this
but is there an easy way (like some simple script to be
run as scheduled task)
to do it ?
THANKS!
The osql utility will do this for you. It can take a query as an input
parameter, and send its output to a file which is also specified as a
parameter.
The osql command can be called from a command file (.bat) so it can be
automated fully.
See the full osql syntax in Books Online.
--
HTH
--
Kalen Delaney
SQL Server MVP
www.SolidQualityLearning.com
"AMY" <anonymous@.discussions.microsoft.com> wrote in message
news:2bb7601c39331$37fc0650$a601280a@.phx.gbl...
> Is there an easy way to automate the following steps
> from SQL server:
> 1. run a sql-query script (like "select * from ..." )
> 2. save the query result to a .txt or .rpt file in a
> designated folder.
> -i know i can write some c++ application to do this
> but is there an easy way (like some simple script to be
> run as scheduled task)
> to do it ?
> THANKS!
>
|||Thank you Kalen, it's a big help!
-amy
>--Original Message--
>The osql utility will do this for you. It can take a
query as an input
>parameter, and send its output to a file which is also
specified as a
>parameter.
>The osql command can be called from a command file (.bat)
so it can be
>automated fully.
>See the full osql syntax in Books Online.
>--
>HTH
>--
>Kalen Delaney
>SQL Server MVP
>www.SolidQualityLearning.com
>
>"AMY" <anonymous@.discussions.microsoft.com> wrote in
message
>news:2bb7601c39331$37fc0650$a601280a@.phx.gbl...
>> Is there an easy way to automate the following steps
>> from SQL server:
>> 1. run a sql-query script (like "select * from ..." )
>> 2. save the query result to a .txt or .rpt file in a
>> designated folder.
>> -i know i can write some c++ application to do this
>> but is there an easy way (like some simple script to be
>> run as scheduled task)
>> to do it ?
>> THANKS!
>>
>
>.
>
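A sketch of the osql invocation Kalen describes, wrapped in a batch file for the scheduler (the server, database, query, and file names are placeholders; -E uses Windows authentication, -o names the output file, -n suppresses numbering and prompts):

```bat
rem nightly_query.bat -- schedule this with Task Scheduler or a SQL Agent CmdExec step
osql -S MYSERVER -E -Q "SELECT * FROM pubs.dbo.authors" -o C:\reports\results.txt -n
```

See the osql topic in Books Online for the full switch list.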

Sunday, February 12, 2012

Auto List Members

In SSMS I have "Auto List Members" and "Parameter Information" turned on for the text editor/all languages, but it does not work in the text editor when connected to a SQL Server 2000 DB. Any suggestion would be appreciated.
SSMS does not support IntelliSense for the SQL editor/T-SQL (as of this time). :)

auto increase text!

Hi,
I have an initial value like this:
'TEST0001'
I want to write a sql script that will automatically take that initial value
and keep adding 1 to it and insert into a table for me until it get to
'TEST9999'.
my table should now store:
TEST0001
TEST0002
...
...
TEST0010
TEST0011
...
...
TEST9999
How do I do this?
Thanks,
Tom d
declare @seed int
set @seed = 1
while @seed < 10000
begin
    insert into tableName
    select 'TEST' + right('000' + cast(@seed as varchar(4)), 4)
    set @seed = @seed + 1
end
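For completeness, the same 9999 rows can be generated without a loop. A set-based sketch for SQL Server 2000 (assumes the same tableName; four digit tables are cross-joined to cover every value from 0001 to 9999):

```sql
-- Set-based alternative: d1..d4 supply the units, tens, hundreds, and thousands digits
insert into tableName
select 'TEST' + right('000' + cast(d1.i + d2.i*10 + d3.i*100 + d4.i*1000 as varchar(4)), 4)
from       (select 0 as i union select 1 union select 2 union select 3 union select 4
            union select 5 union select 6 union select 7 union select 8 union select 9) d1
cross join (select 0 as i union select 1 union select 2 union select 3 union select 4
            union select 5 union select 6 union select 7 union select 8 union select 9) d2
cross join (select 0 as i union select 1 union select 2 union select 3 union select 4
            union select 5 union select 6 union select 7 union select 8 union select 9) d3
cross join (select 0 as i union select 1 union select 2 union select 3 union select 4
            union select 5 union select 6 union select 7 union select 8 union select 9) d4
where d1.i + d2.i*10 + d3.i*100 + d4.i*1000 between 1 and 9999
```

One INSERT instead of 9999; for larger ranges this is considerably faster than the while loop.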
"tom d" <tomd@.discussions.microsoft.com> wrote in message
news:1DB51C59-3F93-476E-91CE-6E7B4C7E7259@.microsoft.com...
> Hi,
> I have an initial value like this:
> 'TEST0001'
> I want to write a sql script that will automatically take that initial
> value
> and keep adding 1 to it and insert into a table for me until it get to
> 'TEST9999'.
> my table should now store:
> TEST0001
> TEST0002
> ...
> ...
> TEST0010
> TEST0011
> ...
> ...
> TEST9999
> How do I do this?
> Thanks,
> Tom d
>