Technology Innovation - Blog para Programadores
This blog grew out of an idea to share solutions that can make programmers' lives easier.
Wednesday, December 5, 2012
LogParser - More Commands
Examples
Keep in mind that most of the examples that I give here are all-in-one command line queries (even though many wrap to multiple lines when displayed here). However, queries can also be run as
logparser file:XXXXX.sql
where XXXXX is the name of a file containing a logparser-friendly SQL query. There are a couple of examples of this in the following list.
The examples given here have been obtained from a variety of sources, including the documentation that ships with the tool, blogs and online documentation, and my own experience. Unfortunately, I don’t have a record of the origin of each individual example, as I’ve compiled these piecemeal over the last two or three years.
I hope you’ll find something useful here and gain an appreciation for just how robust this tool is.
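The file-based invocation described above is also handy for scripted runs. A minimal Python sketch that only assembles the command line, assuming logparser.exe is on the PATH and query.sql is a hypothetical query file:

```python
import subprocess

def build_logparser_command(query_file, output_format="CSV"):
    """Build the argument list for a file-based Log Parser invocation."""
    # Equivalent to: logparser file:query.sql -o:CSV
    return ["logparser", f"file:{query_file}", f"-o:{output_format}"]

cmd = build_logparser_command("query.sql")
print(cmd)  # ['logparser', 'file:query.sql', '-o:CSV']
# subprocess.run(cmd, check=True)  # uncomment on a machine with Log Parser installed
```

Keeping long queries in .sql files also makes them easy to version-control alongside scripts like this.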
1) All page hits by a given IP address
logparser "select cs-uri-stem, count(cs-uri-stem) as requestcount from [LogFileName] where c-ip = '000.00.00.000' group by cs-uri-stem order by count(cs-uri-stem) desc"
2) Hits on a particular page by IP address
logparser "select c-ip, count(c-ip) as requestcount from [LogFileName] where cs-uri-stem like '/search.aspx%' group by c-ip order by count(c-ip) desc"
3) ReverseDNS example. This attempts to find the domain associated with a given IP address.
logparser "select c-ip, REVERSEDNS(c-ip) from [LogFileName] where c-ip = '000.00.00.000' group by c-ip"
4) CSV example. All hits on a page, written to a CSV file.
logparser "select * into OUTPUT.CSV from [LogFileName] where cs-uri-stem like '/pagename.aspx'"
5) Chart example. All hits on a page by an IP address, displayed on a chart.
logparser "select c-ip, count(c-ip) as requestcount into logparserchart.gif from [LogFileName] where cs-uri-stem like '/pagename.aspx' group by c-ip order by count(c-ip) desc" -o:chart
6) Hits per hour from a particular IP address
logparser "select TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time), 3600)), count(*) as numberrequests from [LogFileName] where c-ip='000.000.00.000' group by TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date,time), 3600))"
7) Basic list of IP addresses generating traffic
logparser "select c-ip, count(c-ip) as requestcount from [LogFileName] group by c-ip order by count(c-ip) desc"
8) Basic list of pages being hit
logparser "select cs-uri-stem, count(cs-uri-stem) from [LogFileName] where cs-uri-stem like '%aspx%' or cs-uri-stem like '%ashx%' group by cs-uri-stem order by count(cs-uri-stem) desc"
9) Basic list of pages being hit, including which IPs are doing the hitting
logparser "select cs-uri-stem, c-ip, count(cs-uri-stem) from [LogFileName] where cs-uri-stem like '%aspx%' or cs-uri-stem like '%ashx%' group by cs-uri-stem, c-ip order by count(cs-uri-stem) desc"
10) Pages being hit after a specific date and time
logparser "select cs-uri-stem, c-ip, count(cs-uri-stem) from [LogFileName] where (cs-uri-stem like '%aspx%' or cs-uri-stem like '%ashx%') and date='2009-06-04' and time > '15:00:00' group by cs-uri-stem, c-ip order by count(cs-uri-stem) desc"
11) Counts of hits of ASPX/ASHX pages by hour from a particular IP address
logparser "select TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time), 3600)), count(*) as numberrequests from [LogFileName] where c-ip='000.000.00.00' and (cs-uri-stem like '%aspx%' or cs-uri-stem like '%ashx%') group by TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date,time), 3600))"
12) Counts of hits against specific pages by hour from a particular IP address
logparser "select TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time), 3600)), cs-uri-stem, count(*) as numberrequests from [LogFileName] where c-ip='000.000.00.00' and (cs-uri-stem like '%aspx%' or cs-uri-stem like '%ashx%') group by TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date,time), 3600)), cs-uri-stem order by numberrequests desc"
13) Top browsers
logparser "Select top 50 to_int(mul(100.0,PropCount(*))) as Percent, count(*) as TotalHits, cs(User-Agent) as Browser from [LogFileName] group by Browser order by TotalHits desc"
14) Hourly Bandwidth (chart)
logparser "Select TO_LOCALTIME(QUANTIZE(TO_TIMESTAMP(date, time), 3600)) As Hour, Div(Sum(cs-bytes),1024) As Incoming(K), Div(Sum(sc-bytes),1024) As Outgoing(K) Into BandwidthByHour.gif From [LogFileName] Group By Hour"
15) Requests by URI
logparser "SELECT top 80 QUANTIZE(TO_TIMESTAMP(date, time), 3600) as Hour, TO_LOWERCASE(STRCAT('/',EXTRACT_TOKEN(cs-uri-stem,1,'/'))) as URI, COUNT(*) AS RequestsPerHour, SUM(sc-bytes) AS TotBytesSent, AVG(sc-bytes) AS AvgBytesSent, Max(sc-bytes) AS MaxBytesSent, ADD(1,DIV(Avg(time-taken),1000)) AS AvgTime, ADD(1,DIV(MAX(time-taken),1000)) AS MaxTime FROM [LogFileName] GROUP BY Hour, URI Having RequestsPerHour > 10 ORDER BY RequestsPerHour ASC"
16) Top 10 Images by size
logparser "Select Top 10 StrCat(Extract_Path(TO_Lowercase(cs-uri-stem)),'/') AS RequestedPath, Extract_filename(To_Lowercase(cs-uri-stem)) As RequestedFile, Count(*) AS Hits, Max(time-taken) As MaxTime, Avg(time-taken) As AvgTime, Max(sc-bytes) As BytesSent From [LogFileName] Where (Extract_Extension(To_Lowercase(cs-uri-stem)) IN ('gif';'jpg';'png')) AND (sc-status = 200) Group By To_Lowercase(cs-uri-stem) Order By BytesSent, Hits, MaxTime DESC"
17) Top 10 URLs for a website, with total hits, max time to serve, and average time to serve
logparser "Select TOP 10 STRCAT(EXTRACT_PATH(cs-uri-stem),'/') AS RequestPath, EXTRACT_FILENAME(cs-uri-stem) AS RequestedFile, COUNT(*) AS TotalHits, Max(time-taken) AS MaxTime, AVG(time-taken) AS AvgTime, AVG(sc-bytes) AS AvgBytesSent FROM [LogFileName] GROUP BY cs-uri-stem ORDER BY TotalHits DESC"
18) Top 20 clients
logparser "Select Top 20 c-ip AS Client, Count(*) AS Hits INTO Chart.gif FROM [LogFileName] GROUP BY c-ip ORDER BY Hits Desc"
19) Referrer Broken Links (i.e. external references to broken links on your site)
logparser "SELECT DISTINCT cs(Referer) as Referer, cs-uri-stem as Url INTO ReferBrokenLinks.html FROM [LogFileName] WHERE cs(Referer) IS NOT NULL AND sc-status = 404 AND (sc-substatus IS NULL OR sc-substatus=0)" -tpl:ReferBrokenLinks.tpl
20) Status codes
logparser "SELECT sc-status As Status, COUNT(*) As Number INTO StatusCodes.gif FROM <2> GROUP BY Status ORDER BY Status"
21) Search the Event Log for W3SVC (IIS) log entries and color-coordinate as to Error, Warning, Information. This example writes the output of the query to an HTML file that is generated using a template file.
logparser "SELECT TimeGenerated,EventTypeName,Strings,Message,CASE EventTypeName WHEN 'Error event' THEN 'RED' WHEN 'Warning event' THEN 'YELLOW' WHEN 'Information event' THEN 'WHITE' ELSE 'BLUE' END As Color INTO file.html FROM System WHERE SourceName = 'W3SVC'" -tpl:IISEventLogEntries.tpl
Where IISEventLogEntries.tpl is a file that contains the following:
<LPHEADER>
<HTML>
<HEAD>
<STYLE>
TD { font-family: Arial };
TH { font-family: Arial };
</STYLE> </HEAD> <BODY> <TABLE BORDERCOLOR="BLACK" BORDER="1" CELLPADDING="2" CELLSPACING="2">
<TR>
<TH COLSPAN=4 BGCOLOR="BLACK"><FONT COLOR=WHITE>New W3SVC Messages in System Event Log</FONT></TH>
</TR>
<TR>
<TH ALIGN=LEFT BGCOLOR="#C0C0C0">Time Generated</TH>
<TH ALIGN=LEFT BGCOLOR="#C0C0C0">Event Type</TH>
<TH ALIGN=LEFT BGCOLOR="#C0C0C0">Strings</TH>
<TH ALIGN=LEFT BGCOLOR="#C0C0C0">Message</TH>
</TR>
</LPHEADER> <LPBODY>
<TR bgCOLOR="%Color%">
<TD>%TimeGenerated%</TD>
<TD>%EventTypeName%</TD>
<TD>%Strings%</TD>
<TD>%Message%</TD>
</TR>
</LPBODY> </TABLE>
</BODY>
</HTML>
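In these template files, Log Parser emits the <LPHEADER> section once and then repeats the <LPBODY> section for every output record, replacing each %FieldName% placeholder with the record's value for that field. The substitution idea (a rough illustration, not Log Parser's actual implementation) looks like this in Python:

```python
def fill_template(template, record):
    """Replace %FieldName% placeholders with values from a record dict."""
    out = template
    for field, value in record.items():
        out = out.replace(f"%{field}%", str(value))
    return out

# One <TR> like this is emitted for every record the query returns.
row = '<TR bgCOLOR="%Color%"><TD>%Message%</TD></TR>'
print(fill_template(row, {"Color": "RED", "Message": "Service stopped"}))
# <TR bgCOLOR="RED"><TD>Service stopped</TD></TR>
```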
22) Upload Log Parser query results directly to a table in SQL Server
logparser "select * into LogTable from [LogFileName] where cs-uri-stem like '/folder/filename%'" -o:SQL -createTable:ON -server:[DatabaseServer] -database:[Database] -username:[SqlUser] -password:[SqlPassword]
23) Top 10 images by size sent. Note that this example also shows how to query multiple log files at once.
logparser "Select Top 10 StrCat(Extract_Path(TO_Lowercase(cs-uri-stem)),'/') AS RequestedPath, Extract_filename(To_Lowercase(cs-uri-stem)) As RequestedFile, Count(*) AS Hits, Max(time-taken) As MaxTime, Avg(time-taken) As AvgTime, Max(sc-bytes) As BytesSent INTO TOP10ImagesBySize.txt FROM logs\iis\ex*.log WHERE (Extract_Extension(To_Lowercase(cs-uri-stem)) IN ('gif';'jpg';'png')) AND (sc-status = 200) GROUP BY To_Lowercase(cs-uri-stem) ORDER BY BytesSent, Hits, MaxTime DESC"
24) Browser types (two different approaches)
logparser "SELECT distinct cs(User-Agent), count(*) as hits INTO useragentsalltypes.txt FROM logs\iis\ex*.log GROUP BY cs(user-agent) ORDER BY hits DESC"
logparser "SELECT TO_INT(MUL(100.0,PROPCOUNT(*))) AS Percent, COUNT(*) AS Hits, cs(User-Agent) as Browser INTO UseragentsHits.txt FROM logs\iis\ex*.log GROUP BY Browser ORDER BY Hits DESC"
25) Unique visitors per day. This requires two queries. The first query selects from the IIS logs into a CSV file, and the second selects from that CSV file.
logparser "SELECT DISTINCT cs-username, date INTO tempUniqueVisitorsPerDay.csv FROM logs\iis\ex*.log WHERE cs-username <> NULL Group By Date, cs-username"
logparser "SELECT date, count(cs-username) as UniqueVisitors into test.txt FROM tempUniqueVisitorsPerDay.csv GROUP BY date"
26) Top 10 largest ASPX pages
logparser "Select Top 10 StrCat(Extract_Path(TO_Lowercase(cs-uri-stem)),'/') AS RequestedPath, Extract_filename(To_Lowercase(cs-uri-stem)) As RequestedFile, Count(*) AS Hits, Max(time-taken) As MaxTime, Avg(time-taken) As AvgTime, Max(sc-bytes) As BytesSent INTO top10pagesbysize.txt FROM logs\iis\ex*.log WHERE (Extract_Extension(To_Lowercase(cs-uri-stem)) IN ('aspx')) AND (sc-status = 200) GROUP BY To_Lowercase(cs-uri-stem) ORDER BY BytesSent, Hits, MaxTime DESC"
27) Top 10 slowest ASPX pages
logparser "SELECT TOP 10 cs-uri-stem, max(time-taken) as MaxTime, avg(time-taken) as AvgTime INTO toptimetaken.txt FROM logs\iis\ex*.log WHERE extract_extension(to_lowercase(cs-uri-stem)) = 'aspx' GROUP BY cs-uri-stem ORDER BY MaxTime DESC"
28) Top 10 slowest ASPX pages on a specific day
logparser "SELECT TOP 10 cs-uri-stem, max(time-taken) as MaxTime, avg(time-taken) as AvgTime INTO toptimetaken.txt FROM logs\iis\ex*.log WHERE extract_extension(to_lowercase(cs-uri-stem)) = 'aspx' AND TO_STRING(To_timestamp(date, time), 'MMdd')='1003' GROUP BY cs-uri-stem ORDER BY MaxTime DESC"
29) Daily bandwidth
logparser "Select To_String(To_timestamp(date, time), 'MM-dd') As Day, Div(Sum(cs-bytes),1024) As Incoming(K), Div(Sum(sc-bytes),1024) As Outgoing(K) Into BandwidthByDay.gif From logs\iis\ex*.log Group By Day"
30) Bandwidth by hour
logparser "SELECT QUANTIZE(TO_TIMESTAMP(date, time), 3600) AS Hour, SUM(sc-bytes) AS TotalBytesSent INTO BytesSentPerHour.gif FROM logs\iis\ex*.log GROUP BY Hour ORDER BY Hour"
31) Average page load time per user
logparser "Select Top 20 cs-username AS UserName, AVG(time-taken) AS AvgTime, Count(*) AS Hits INTO AvgTimePerUser.txt FROM logs\iis\ex*.log WHERE cs-username IS NOT NULL GROUP BY cs-username ORDER BY AvgTime DESC"
32) Average page load time for a specific user
logparser "Select cs-username AS UserName, AVG(time-taken) AS AvgTime, Count(*) AS Hits INTO AvgTimeOnSpecificUser.txt FROM logs\iis\ex*.log WHERE cs-username = 'CONTOSO\User1234' GROUP BY cs-username"
33) Error trends. This query is quite long, and is more easily expressed in a text file than on the command line. So, Log Parser reads and executes the query contained in the specified text file.
logparser file:errortrend.sql
Where errortrend.sql contains the following:
SELECT
TO_STRING(To_timestamp(date, time), 'MMdd') AS Day,
SUM(c200) AS 200s,
SUM(c206) AS 206s,
SUM(c301) AS 301s,
SUM(c302) AS 302s,
SUM(c304) AS 304s,
SUM(c400) AS 400s,
SUM(c401) AS 401s,
SUM(c403) AS 403s,
SUM(c404) AS 404s,
SUM(c500) AS 500s,
SUM(c501) AS 501s,
SUM(c502) AS 502s,
SUM(c503) AS 503s,
SUM(c504) AS 504s,
SUM(c505) AS 505s
USING
CASE sc-status WHEN 200 THEN 1 ELSE 0 END AS c200,
CASE sc-status WHEN 206 THEN 1 ELSE 0 END AS c206,
CASE sc-status WHEN 301 THEN 1 ELSE 0 END AS c301,
CASE sc-status WHEN 302 THEN 1 ELSE 0 END AS c302,
CASE sc-status WHEN 304 THEN 1 ELSE 0 END AS c304,
CASE sc-status WHEN 400 THEN 1 ELSE 0 END AS c400,
CASE sc-status WHEN 401 THEN 1 ELSE 0 END AS c401,
CASE sc-status WHEN 403 THEN 1 ELSE 0 END AS c403,
CASE sc-status WHEN 404 THEN 1 ELSE 0 END AS c404,
CASE sc-status WHEN 500 THEN 1 ELSE 0 END AS c500,
CASE sc-status WHEN 501 THEN 1 ELSE 0 END AS c501,
CASE sc-status WHEN 502 THEN 1 ELSE 0 END AS c502,
CASE sc-status WHEN 503 THEN 1 ELSE 0 END AS c503,
CASE sc-status WHEN 504 THEN 1 ELSE 0 END AS c504,
CASE sc-status WHEN 505 THEN 1 ELSE 0 END AS c505
INTO ErrorChart.gif
FROM
logs\iis\ex*.log
GROUP BY
Day
ORDER BY
Day
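The USING/CASE pattern in errortrend.sql turns the single sc-status column into per-code counters that SUM can then total by day. The same counting logic, sketched in Python over hypothetical (day, status) pairs:

```python
from collections import defaultdict

def pivot_status_counts(rows, codes=(200, 404, 500)):
    """Count hits per day for each tracked status code, like the SUM(CASE...) pivot."""
    counts = defaultdict(lambda: {c: 0 for c in codes})
    for day, status in rows:
        if status in codes:
            counts[day][status] += 1
    return dict(counts)

# Hypothetical (day, sc-status) pairs extracted from a log
rows = [("0604", 200), ("0604", 404), ("0604", 200), ("0605", 500)]
print(pivot_status_counts(rows))
```

Each tracked code becomes one output column per day, which is exactly what makes the query chart-friendly.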
34) Win32 errors
logparser "SELECT sc-win32-status as ErrorNumber, WIN32_ERROR_DESCRIPTION(sc-win32-status) as ErrorDesc, Count(*) AS Total INTO Win32ErrorNumbers.txt FROM logs\iis\ex*.log WHERE sc-win32-status>0 GROUP BY ErrorNumber ORDER BY Total DESC"
35) Substatus codes
logparser "SELECT sc-status, sc-substatus, Count(*) AS Total INTO 401subcodes.txt FROM logs\iis\ex*.log WHERE sc-status=401 GROUP BY sc-status, sc-substatus ORDER BY sc-status, sc-substatus DESC"
36) Substatus codes per day. This is another example of executing a query contained in a text file.
logparser file:substatusperday.sql
Where substatusperday.sql contains the following:
SELECT
TO_STRING(To_timestamp(date, time), 'MMdd') AS Day,
SUM(c1) AS 4011,
SUM(c2) AS 4012,
SUM(c3) AS 4013,
SUM(c4) AS 4014,
SUM(c5) AS 4015,
SUM(c7) AS 4017
USING
CASE sc-substatus WHEN 1 THEN 1 ELSE 0 END AS c1,
CASE sc-substatus WHEN 2 THEN 1 ELSE 0 END AS c2,
CASE sc-substatus WHEN 3 THEN 1 ELSE 0 END AS c3,
CASE sc-substatus WHEN 4 THEN 1 ELSE 0 END AS c4,
CASE sc-substatus WHEN 5 THEN 1 ELSE 0 END AS c5,
CASE sc-substatus WHEN 7 THEN 1 ELSE 0 END AS c7
INTO
401subcodesperday.txt
FROM
logs\iis\ex*.log
WHERE
sc-status=401
GROUP BY
Day
ORDER BY
Day
37) Substatus codes per page
logparser "SELECT TOP 20 cs-uri-stem, sc-status, sc-substatus, Count(*) AS Total INTO 401Pagedetails.txt FROM logs\iis\ex*.log WHERE sc-status=401 GROUP BY cs-uri-stem, sc-status, sc-substatus ORDER BY Total"
38) MB sent per HTTP status code
logparser "SELECT EXTRACT_EXTENSION(cs-uri-stem) AS PageType, SUM(sc-bytes) as TotalBytesSent, TO_INT(MUL(PROPSUM(sc-bytes), 100.0)) AS PercentBytes INTO PagesWithLargestBytesSent.htm FROM logs\iis\ex*.log GROUP BY Pagetype ORDER BY PercentBytes DESC"
39) 500 errors per ASPX and Domain User
logparser "SELECT cs-username, cs-uri-stem, count(*) as Times INTO 500PagesByUserAndPage.txt FROM logs\iis\ex*.log WHERE sc-status=500 GROUP BY cs-username, cs-uri-stem ORDER BY Times DESC"
40) Percent of 500 errors caused by each user
logparser "SELECT cs-username, count(*) as Times, propcount(*) as Percent INTO 500ErrorsByUser.csv FROM logs\iis\ex*.log WHERE sc-status=500 GROUP BY cs-username ORDER BY Times DESC"
41) Determine what percentage of the total bytes sent are being caused by each page type
logparser "SELECT EXTRACT_EXTENSION(cs-uri-stem) AS PageType, SUM(sc-bytes) as TotalBytesSent, TO_INT(MUL(PROPSUM(sc-bytes), 100.0)) AS PercentBytes INTO PagesWithLargestBytesSent.txt FROM logs\iis\ex*.log GROUP BY Pagetype ORDER BY PercentBytes DESC"
42) Top 20 pages with a specific HTTP return code
logparser "SELECT TOP 20 cs-uri-stem, sc-status, Count(*) AS Total INTO TOP20PagesWith401.txt FROM logs\iis\ex*.log WHERE TO_LOWERCASE(cs-uri-stem) LIKE '%.aspx' and sc-status=401 GROUP BY cs-uri-stem, sc-status ORDER BY Total, cs-uri-stem, sc-status DESC"
43) Check traffic from IP addresses
logparser "Select c-ip AS Client, Div(Sum(cs-bytes),1024) As IncomingBytes(K), Div(Sum(sc-bytes),1024) As OutgoingBytes(K), MAX(time-taken) as MaxTime, AVG(time-taken) as AvgTime, count(*) as hits INTO errorsperip.txt FROM logs\iis\ex*.log GROUP BY client ORDER BY Hits DESC"
44) Check errors by IP address
logparser file:errorbyip.sql
Where errorbyip.sql contains the following:
Select
c-ip AS Client,
SUM(c400) AS 400s,
sum(c401) AS 401s,
SUM(c403) AS 403s,
SUM(c404) AS 404s,
SUM(c500) AS 500s,
SUM(c501) AS 501s,
SUM(c502) AS 502s,
SUM(c503) AS 503s,
SUM(c504) AS 504s,
SUM(c505) AS 505s
USING
CASE sc-status WHEN 400 THEN 1 ELSE 0 END AS c400,
CASE sc-status WHEN 401 THEN 1 ELSE 0 END AS c401,
CASE sc-status WHEN 403 THEN 1 ELSE 0 END AS c403,
CASE sc-status WHEN 404 THEN 1 ELSE 0 END AS c404,
CASE sc-status WHEN 500 THEN 1 ELSE 0 END AS c500,
CASE sc-status WHEN 501 THEN 1 ELSE 0 END AS c501,
CASE sc-status WHEN 502 THEN 1 ELSE 0 END AS c502,
CASE sc-status WHEN 503 THEN 1 ELSE 0 END AS c503,
CASE sc-status WHEN 504 THEN 1 ELSE 0 END AS c504,
CASE sc-status WHEN 505 THEN 1 ELSE 0 END AS c505
INTO
IPNumberFileName.txt
FROM
logs\iis\ex*.log
WHERE
c-ip='<IP address goes here>'
GROUP BY
client
45) Find broken links
logparser "SELECT DISTINCT cs(Referer) as Referer, cs-uri-stem as Url INTO ReferBrokenLinks.txt FROM logs\iis\ex*.log WHERE cs(Referer) IS NOT NULL AND sc-status=404 AND (sc-substatus IS NULL OR sc-substatus=0)"
46) Top 10 pages with most hits
logparser "Select TOP 10 STRCAT(EXTRACT_PATH(cs-uri-stem),'/') AS RequestPath, EXTRACT_FILENAME(cs-uri-stem) AS RequestedFile, COUNT(*) AS TotalHits, Max(time-taken) AS MaxTime, AVG(time-taken) AS AvgTime, AVG(sc-bytes) AS AvgBytesSent INTO Top10Urls.txt FROM logs\iis\ex*.log GROUP BY cs-uri-stem ORDER BY TotalHits DESC"
47) Unique users per browser type (requires two queries)
logparser "SELECT DISTINCT cs-username, cs(user-agent) INTO UserAgentsUniqueUsers1.csv FROM logs\iis\ex*.log WHERE cs-username <> NULL GROUP BY cs-username, cs(user-agent)"
logparser "SELECT cs(user-agent), count(cs-username) as UniqueUsersPerAgent, TO_INT(MUL(PROPCOUNT(*), 100)) AS Percentage INTO UniqueUsersPerAgent.txt FROM UserAgentsUniqueUsers1.csv GROUP BY cs(user-agent) ORDER BY UniqueUsersPerAgent DESC"
48) Bytes sent per file extension
logparser "SELECT EXTRACT_EXTENSION( cs-uri-stem ) AS Extension, MUL(PROPSUM(sc-bytes),100.0) AS PercentageOfBytes, Div(Sum(sc-bytes),1024) as AmountOfMbBytes INTO BytesPerExtension.txt FROM logs\iis\ex*.log GROUP BY Extension ORDER BY PercentageOfBytes DESC"
49) Domains referring traffic to your site
logparser "SELECT EXTRACT_TOKEN(cs(Referer), 2, '/') AS Domain, COUNT(*) AS [Requests] INTO ReferringDomains.txt FROM logs\iis\ex*.log GROUP BY Domain ORDER BY Requests DESC"
50) OS types (requires two queries)
logparser "SELECT DISTINCT c-ip, cs(user-agent) INTO UserAgentsUniqueUsers.csv FROM logs\iis\ex*.log WHERE c-ip <> NULL GROUP BY c-ip, cs(user-agent)"
logparser file:getos.sql
Where getos.sql contains the following:
SELECT
SUM (c70) AS Win7,
SUM (c60) AS Vista,
SUM (c52) AS Win2003,
SUM (c51) AS WinXP,
SUM (C50) AS Win2000,
SUM (W98) AS Win98,
SUM (W95) AS Win95,
SUM (W9x) AS Win9x,
SUM (NT4) AS WinNT4,
SUM (OSX) AS OSX,
SUM (Mac) AS Mac,
SUM (PPC) AS MacPPC,
SUM (Lnx) AS Linux
USING
CASE strcnt(cs(User-Agent),'Windows+NT+6.1') WHEN 1 THEN 1 ELSE 0 END AS C70,
CASE strcnt(cs(User-Agent),'Windows+NT+6.0') WHEN 1 THEN 1 ELSE 0 END AS C60,
CASE strcnt(cs(User-Agent),'Windows+NT+5.2') WHEN 1 THEN 1 ELSE 0 END AS C52,
CASE strcnt(cs(User-Agent),'Windows+NT+5.1') WHEN 1 THEN 1 ELSE 0 END AS C51,
CASE strcnt(cs(User-Agent),'Windows+NT+5.0') WHEN 1 THEN 1 ELSE 0 END AS C50,
CASE strcnt(cs(User-Agent),'Win98') WHEN 1 THEN 1 ELSE 0 END AS W98,
CASE strcnt(cs(User-Agent),'Win95') WHEN 1 THEN 1 ELSE 0 END AS W95,
CASE strcnt(cs(User-Agent),'Win+9x+4.90') WHEN 1 THEN 1 ELSE 0 END AS W9x,
CASE strcnt(cs(User-Agent),'Winnt4.0') WHEN 1 THEN 1 ELSE 0 END AS NT4,
CASE strcnt(cs(User-Agent),'OS+X') WHEN 1 THEN 1 ELSE 0 END AS OSX,
CASE strcnt(cs(User-Agent),'Mac') WHEN 1 THEN 1 ELSE 0 END AS Mac,
CASE strcnt(cs(User-Agent),'PPC') WHEN 1 THEN 1 ELSE 0 END AS PPC,
CASE strcnt(cs(User-Agent),'Linux') WHEN 1 THEN 1 ELSE 0 END AS Lnx
INTO
GetOSUsed.txt
FROM
UserAgentsUniqueUsers.csv
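getos.sql classifies visitors by substring checks against the User-Agent string, which IIS logs with '+' in place of spaces. A rough Python sketch of the same classification, using a few of the markers from the query:

```python
# Ordered (marker, label) pairs; first match wins. Markers mirror a subset of
# the STRCNT checks in getos.sql ('+' stands for an encoded space in IIS logs).
OS_MARKERS = [
    ("Windows+NT+6.1", "Win7"),
    ("Windows+NT+6.0", "Vista"),
    ("Windows+NT+5.1", "WinXP"),
    ("OS+X", "OSX"),
    ("Linux", "Linux"),
]

def classify_os(user_agent):
    """Return the first OS label whose marker appears in the User-Agent."""
    for marker, label in OS_MARKERS:
        if marker in user_agent:
            return label
    return "Other"

print(classify_os("Mozilla/5.0+(Windows+NT+6.1;+WOW64)"))  # Win7
```

Note that the first-match ordering matters: a generic marker like 'Mac' must come after more specific ones, or it will swallow their counts.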
51) Get timeout errors from the server Event Log. Display results in a datagrid.
logparser "select * from \\servername\application where message like '%timeout expired%'" -i:EVT -o:datagrid
52) Get exceptions from the server Event (Application) Log
logparser "select timegenerated, eventtypename, eventcategoryname, message into webserverlog.csv from \\servername\application where message like '%myapplication%exception%'" -i:EVT
Source: http://mlichtenberg.wordpress.com/2011/02/03/log-parser-rocks-more-than-50-examples/
Tuesday, December 4, 2012
How Transactional Replication Works
Transactional replication is implemented by the SQL Server Snapshot Agent, Log Reader Agent, and Distribution Agent. The Snapshot Agent prepares snapshot files containing schema and data of published tables and database objects, stores the files in the snapshot folder, and records synchronization jobs in the distribution database on the Distributor.
The Log Reader Agent monitors the transaction log of each database configured for transactional replication and copies the transactions marked for replication from the transaction log into the distribution database, which acts as a reliable store-and-forward queue. The Distribution Agent copies the initial snapshot files from the snapshot folder and the transactions held in the distribution database tables to Subscribers.
Incremental changes made at the Publisher flow to Subscribers according to the schedule of the Distribution Agent, which can run continuously for minimal latency, or at scheduled intervals. Because changes to the data must be made at the Publisher (when transactional replication is used without immediate updating or queued updating options), update conflicts are avoided. Ultimately, all Subscribers will achieve the same values as the Publisher. If immediate updating or queued updating options are used with transactional replication, updates can be made at the Subscriber, and with queued updating, conflicts might occur. For more information, see How Updatable Subscriptions Work.
URL: http://msdn.microsoft.com/en-us/library/ms151706(v=sql.105).aspx
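The store-and-forward flow described above can be pictured as a queue sitting between the Publisher and its Subscribers. A deliberately simplified Python toy model of the idea (an illustration only, not SQL Server's implementation):

```python
from collections import deque

class Distributor:
    """Toy store-and-forward queue between a publisher and its subscribers."""
    def __init__(self):
        self.queue = deque()   # plays the role of the distribution database
        self.subscribers = []  # each subscriber is just a list of applied txns

    def log_reader(self, transactions):
        # Log Reader Agent: copy transactions marked for replication
        # from the publisher's log into the distribution queue.
        self.queue.extend(transactions)

    def distribution_agent(self):
        # Distribution Agent: forward queued transactions to every subscriber,
        # in commit order, so all subscribers converge on the same state.
        while self.queue:
            txn = self.queue.popleft()
            for sub in self.subscribers:
                sub.append(txn)

sub1, sub2 = [], []
d = Distributor()
d.subscribers = [sub1, sub2]
d.log_reader(["INSERT row 1", "UPDATE row 1"])
d.distribution_agent()
print(sub1)  # ['INSERT row 1', 'UPDATE row 1'] -- identical on every subscriber
```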
Friday, November 30, 2012
Automatic IIS recycling
How to configure memory limits on IIS application pools
Whenever you set up an IIS site, it is good practice to limit the memory available to its application pool. When the limit is exceeded, IIS recycles the pool and kills that site's pending worker processes, which contains failures caused by programming errors, such as processes that are never killed or applications that never recycle on their own.
========================================================
The same settings haven't changed since IIS 6.0.
In the IIS MMC, click Application Pools. Right-click a pool and select Advanced
Settings. In the Recycling section, you will see two settings:
Private Memory Limit (KB) and Virtual Memory Limit (KB).
When either of these two settings is set and the worker process exceeds the
private or virtual memory quota, IIS recycles that pool, which limits
the memory usage.
========================================================
Known caveats:
All the limits in the application pool exist for badly behaving apps. More specifically:
- To prevent a bad app from disturbing the good apps.
- To try to keep the bad app running as long as possible.
If your application is leaking, then without a limit it will crash at around 1.2 to 1.6 GB (if memory serves), so 1 GB is sensible. If during normal operation your application consumes no more than 100 MB and you have many app pools on the server, set the limit lower to prevent one app from damaging the others.
To conclude: 1 GB is sensible. Hitting the limit should be treated as an application crash, and the cause debugged and fixed.
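On IIS 7 and later, the same limits can also be set from the command line via appcmd.exe. A sketch that only builds the command, assuming a hypothetical pool named MyAppPool and a 1 GB limit (the property value is in KB):

```python
import subprocess

# Default appcmd location on IIS 7+; adjust if your Windows directory differs.
APPCMD = r"C:\Windows\System32\inetsrv\appcmd.exe"

def build_memory_limit_command(pool, limit_kb):
    """Build the appcmd invocation that sets a private-memory recycling limit."""
    return [APPCMD, "set", "apppool", pool,
            f"-recycling.periodicRestart.privateMemory:{limit_kb}"]

cmd = build_memory_limit_command("MyAppPool", 1048576)  # 1048576 KB = 1 GB
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment on the IIS server itself
```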
Want to learn more:
http://blogs.msdn.com/b/pfedev/archive/2009/01/22/memory-based-recycling-in-iis-6-0.aspx
Monday, November 19, 2012
How to use IIS Logs with LOGPARSER
Example LogParser query against IIS logs:
LogParser -i:IISW3C "SELECT * FROM \\servername\LogsConsolidation\Repository\W3C3\*.log WHERE cs-uri-stem = '/imagesSite/cover.jpg'" -o:CSV > C:\HitsCovers.csv
Another example, now with dates:
LogParser -i:IISW3C "SELECT * FROM \\ServerName\LogsConsolidation\Repository\W3C3\*.log WHERE cs-uri-stem = '/imagesServerName/imgCovers.jpg' AND (DATE >= '2012-11-16' AND DATE <= '2012-11-23') " -o:CSV > C:\HitsCovers_Nov16_23.csv
How to use LogParser for Windows
LogParser is a powerful Windows tool for searching IIS and other logs.
It has specific parameterizations for several log types.
Sites with more info:
http://technet.microsoft.com/pt-br/library/cc779255(v=ws.10).aspx
http://technet.microsoft.com/en-us/scriptcenter/dd919274.aspx
http://support.microsoft.com/kb/910447
http://en.wikipedia.org/wiki/Logparser
Various LogParser examples:
LogParser commands
==> CSV
logparser "SELECT Count(EventID) AS Contador, ComputerName, EventID, EventTypeName, to_string(TimeGenerated,'yyyy-MM-dd hh:mm') AS DateEvent, Strings INTO EventViewerVDFAPPs_v2.csv FROM *.evt WHERE TimeGenerated > timestamp('01-03-2011', 'dd-MM-yyyy') AND Strings like '%Fail to obtain widget menu%' GROUP BY ComputerName, EventID, EventTypeName, DateEvent, Strings ORDER BY DateEvent Desc" -o:CSV
==> XML
logparser "SELECT Count(EventID) AS Contador, ComputerName, EventID, EventTypeName, to_string(TimeGenerated,'yyyy-MM-dd hh:mm') AS DateEvent, Strings INTO EventViewerVDFAPPs_v2.xml FROM *.evt WHERE TimeGenerated > timestamp('01-03-2011', 'dd-MM-yyyy') AND Strings like '%exception%' GROUP BY ComputerName, EventID, EventTypeName, DateEvent, Strings ORDER BY DateEvent Desc" -o:XML
-- PPI: Command for IIS log files, to catch web services and 500 errors
logparser "SELECT * INTO pires500.csv FROM *.log WHERE TEXT LIKE '%GETGROUPGRANTEDKEYS%' AND TEXT LIKE '% 500 %'" -o:csv
logparser "SELECT ComputerName, EventId, EventType, SourceName, count(*) INTO 'EventLogResult.csv' FROM 'STG-ASVR-*.Application.evt' GROUP BY ComputerName, EventId, EventType, SourceName ORDER BY ComputerName, EventType, EventId" -o:CSV
logparser "Select ComputerName, to_string(TimeGenerated,'yyyy-MM-dd') as Day, EventTypeName, count(*) INTO 'EventLogResult.csv' FROM 'STG-ASVR-*.Application.evt' GROUP BY ComputerName, Day, EventTypeName ORDER BY ComputerName, Day, EventTypeName" -o:CSV
-- PPI: Export Event Viewer data for event 16571 (CCSKIPS) to a CSV
logparser "SELECT Count(EventID) AS Contador, ComputerName, EventID, EventTypeName, to_string(TimeGenerated,'yyyy-MM-dd hh:mm') AS DateEvent INTO EventViewerMON_ASVR.csv FROM mon-asvr-*_App.evt WHERE EventID = 16571 GROUP BY ComputerName, EventID, EventTypeName, DateEvent ORDER BY DateEvent Desc" -o:CSV
-- PPI: Get data within a given period
logparser "SELECT ComputerName, EventID, EventTypeName, Strings, to_string(TimeGenerated,'yyyy-MM-dd hh:mm') AS DateEvent from CAX-DSVR-02_App.evt WHERE TimeGenerated < timestamp('30-09-2008','dd-MM-yyyy') AND TimeGenerated > timestamp('20-09-2008', 'dd-MM-yyyy') GROUP BY ComputerName, EventID, EventTypeName, Strings, DateEvent ORDER BY DateEvent Desc"
-- PPI: Also get the Strings data (without the newline; the character still needs to be identified)
logparser "SELECT ComputerName, EventID, EventTypeName, to_string(TimeGenerated,'yyyy-MM-dd hh:mm') AS DateEvent, REPLACE_CHR(Message, '£', '') AS Message INTO Pesquisa.csv FROM CAX-DSVR-02_App.evt WHERE TimeGenerated < timestamp('30-09-2008','dd-MM-yyyy') AND TimeGenerated > timestamp('20-09-2008', 'dd-MM-yyyy') GROUP BY ComputerName, EventID, EventTypeName, Message, DateEvent ORDER BY DateEvent Desc" -o:CSV
-- PPI: Get data from servers
logparser "SELECT ComputerName, EventId, EventTypeName, to_string(TimeGenerated, 'yyyy-MM-dd hh:mm') AS EventDate, Count(1) as CountEvents FROM EventLog_20081015.evt where EventId = 14206 GROUP BY ComputerName, EventID, EventTypeName, EventDate ORDER BY EventDate DESC"
-- PPI: Looking for IP
logparser "SELECT ComputerName, EventId, EventTypeName, to_string(TimeGenerated, 'yyyy-MM-dd hh:mm') AS EventDate, Count(1) as CountEvents FROM EventLog_20081015.evt where EventId = 14206 and Strings like '%10.193.219.107%' GROUP BY ComputerName, EventID, EventTypeName, EventDate ORDER BY EventDate DESC"
logparser "SELECT ComputerName, EventId, EventTypeName, to_string(TimeGenerated, 'yyyy-MM-dd hh:mm') AS EventDate, Count(1) as CountEvents FROM EventLog_20081015.evt where EventId = 14206 and Strings not like '%10.193.219.107%' GROUP BY ComputerName, EventID, EventTypeName, EventDate ORDER BY EventDate DESC"
-- PPI: For a specific EventID
logparser "SELECT UserName, ComputerName, EventId, EventTypeName, to_string(TimeGenerated, 'yyyy-MM-dd hh:mm') AS EventDate INTO 'EventsSFB01_12994' FROM *.evt where EventId = 12994 ORDER BY EventDate DESC" -o:csv
logparser "SELECT ComputerName, EventId, EventTypeName, to_string(TimeGenerated, 'yyyy-MM-dd hh:mm') AS EventDate, Count(1) as CountEvents INTO 'EventsSFB01_12994' FROM *.evt where EventId = 12994 GROUP BY ComputerName, EventID, EventTypeName, EventDate ORDER BY EventDate DESC" -o:csv
-- PPI: Get an XML list of a given string from IIS logs
LogParser -i:TEXTLINE "SELECT * FROM \\Alfr-log-01\LogsConsolidation\Repository\W3C3\*.log WHERE Text like '%PromoFriends.jpg%'" -o:xml > C:\Oper\LogParser\WIDGETS\PromoFriends2.xml
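A TEXTLINE query like this is essentially a substring scan over raw log lines. The same filter, sketched in Python over a hypothetical list of lines:

```python
def grep_lines(lines, needle):
    """Return the log lines containing the needle, like TEXT LIKE '%needle%'."""
    return [line for line in lines if needle in line]

# Hypothetical raw IIS log lines
logs = [
    "GET /images/PromoFriends.jpg 200",
    "GET /index.html 200",
]
print(grep_lines(logs, "PromoFriends.jpg"))  # ['GET /images/PromoFriends.jpg 200']
```

TEXTLINE treats each line as a single Text field, so this is useful when you don't need (or can't rely on) the W3C field structure.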
Tem parametrizações especificas para vários tipos de Logs.
Sites com info:
http://technet.microsoft.com/pt-br/library/cc779255(v=ws.10).aspx
http://technet.microsoft.com/en-us/scriptcenter/dd919274.aspx
http://support.microsoft.com/kb/910447
http://en.wikipedia.org/wiki/Logparser
Exemplos vários para LogParser:
Comando LOG PARSER
==> CSV
logparser "SELECT Count(EventID) AS Contador, ComputerName, EventID, EventTypeName, to_string(TimeGenerated,'yyyy-MM-dd hh:mm') AS DateEvent, Strings from *.evt to EventViewerVDFAPPs_v2.csv WHERE TimeGenerated > timestamp('01-03-2011', 'dd-MM-yyyy') AND Strings
like'%Fail to obtain widget menu%' GROUP BY ComputerName, EventID, EventTypeName, DateEvent, Strings ORDER BY DateEvent Desc" -o:CSV
==> XML
logparser "SELECT Count(EventID) AS Contador, ComputerName, EventID, EventTypeName, to_string(TimeGenerated,'yyyy-MM-dd hh:mm') AS DateEvent, Strings from *.evt to EventViewerVDFAPPs_v2.xml WHERE TimeGenerated > timestamp('01-03-2011', 'dd-MM-yyyy') AND Strings like '%exception%' GROUP BY ComputerName, EventID, EventTypeName, DateEvent, Strings ORDER BY DateEvent Desc" -o:XML
-- PPI: Command for IIS log files, to catch web service calls returning 500 errors
logparser "SELECT * FROM *.log to pires500.csv WHERE TEXT LIKE '%GETGROUPGRANTEDKEYS%' AND TEXT LIKE '% 500 %'" -o:csv
logparser "SELECT ComputerName, EventId, EventType, SourceName, count(*) From 'STG-ASVR-*.Application.evt' to 'EventLogResult.csv' group by ComputerName, EventId, EventType, SourceName Order By ComputerName, EventType, EventId" -o:CSV
logparser "Select ComputerName,to_string(TimeGenerated,'yyyy-MM-dd') as Day, EventTypeName, count(*) From 'STG-ASVR-*.Application.evt' to 'EventLogResult.csv' group by computername, Day, EventTypeName Order By computername, Day, eventtypeName" -o:CSV
-- PPI: Gets the Event Viewer data for event 16571 (CCSKIPS) into a CSV
logparser "SELECT Count(EventID) AS Contador, ComputerName, EventID, EventTypeName, to_string(TimeGenerated,'yyyy-MM-dd hh:mm') AS DateEvent from mon-asvr-*_App.evt to EventViewerMON_ASVR.csv WHERE EventID = 16571 GROUP BY ComputerName, EventID, EventTypeName, DateEvent ORDER BY DateEvent Desc" -o:CSV
-- PPI: Gets data within a given time period
logparser "SELECT ComputerName, EventID, EventTypeName, Strings, to_string(TimeGenerated,'yyyy-MM-dd hh:mm') AS DateEvent from CAX-DSVR-02_App.evt WHERE TimeGenerated < timestamp('30-09-2008','dd-MM-yyyy') AND TimeGenerated > timestamp('20-09-2008', 'dd-MM-yyyy') GROUP BY ComputerName, EventID, EventTypeName, Strings, DateEvent ORDER BY DateEvent Desc"
-- PPI: Also get the Strings data (without the line break; the exact character is still to be figured out)
logparser "SELECT ComputerName, EventID, EventTypeName, to_string(TimeGenerated,'yyyy-MM-dd hh:mm') AS DateEvent, REPLACE_CHR(Message, '£', '') AS Message from CAX-DSVR-02_App.evt to Pesquisa.csv WHERE TimeGenerated < timestamp('30-09-2008','dd-MM-yyyy') AND TimeGenerated > timestamp('20-09-2008', 'dd-MM-yyyy') GROUP BY ComputerName, EventID, EventTypeName, Message, DateEvent ORDER BY DateEvent Desc" -o:CSV
-- PPI: Get data from servers
logparser "SELECT ComputerName, EventId, EventTypeName, to_string(TimeGenerated, 'yyyy-MM-dd hh:mm') AS EventDate, Count(1) as CountEvents FROM EventLog_20081015.evt where EventId = 14206 GROUP BY ComputerName, EventID, EventTypeName, EventDate ORDER BY EventDate DESC"
-- PPI: Looking for a specific IP address
logparser "SELECT ComputerName, EventId, EventTypeName, to_string(TimeGenerated, 'yyyy-MM-dd hh:mm') AS EventDate, Count(1) as CountEvents FROM EventLog_20081015.evt where EventId = 14206 and Strings like '%10.193.219.107%' GROUP BY ComputerName, EventID, EventTypeName, EventDate ORDER BY EventDate DESC"
logparser "SELECT ComputerName, EventId, EventTypeName, to_string(TimeGenerated, 'yyyy-MM-dd hh:mm') AS EventDate, Count(1) as CountEvents FROM EventLog_20081015.evt where EventId = 14206 and Strings not like '%10.193.219.107%' GROUP BY ComputerName, EventID, EventTypeName, EventDate ORDER BY EventDate DESC"
-- PPI: For a specific EventID
logparser "SELECT UserName, ComputerName, EventId, EventTypeName, to_string(TimeGenerated, 'yyyy-MM-dd hh:mm') AS EventDate FROM *.evt to 'EventsSFB01_12994' where EventId = 12994 ORDER BY EventDate DESC" -o:csv
logparser "SELECT ComputerName, EventId, EventTypeName, to_string(TimeGenerated, 'yyyy-MM-dd hh:mm') AS EventDate, Count(1) as CountEvents FROM *.evt to 'EventsSFB01_12994' where EventId = 12994 GROUP BY ComputerName, EventID, EventTypeName, EventDate ORDER BY EventDate DESC" -o:csv
-- PPI: Get an XML list of a given string in IIS logs
LogParser -i:TEXTLINE "SELECT * FROM \\Alfr-log-01\LogsConsolidation\Repository\W3C3\*.log WHERE Text like '%PromoFriends.jpg%'" -o:xml > C:\Oper\LogParser\WIDGETS\PromoFriends2.xml
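Queries this long are easier to maintain in a file and run with LogParser's file: syntax. A minimal sketch, with a hypothetical query.sql file name and one of the EventID queries from above; the logparser invocation itself has to run on the Windows machine:

```shell
# Write a LogParser-friendly SQL query to a file (file name and EventID are
# illustrative, taken from the examples above).
cat > query.sql <<'EOF'
SELECT ComputerName, EventID, COUNT(*) AS CountEvents
FROM *.evt
WHERE EventID = 12994
GROUP BY ComputerName, EventID
ORDER BY CountEvents DESC
EOF

# On the Windows box, run it with:
#   logparser file:query.sql -o:CSV
```

Keeping the query in a file also makes it easy to version and reuse it across servers.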
Wednesday, November 14, 2012
Most Common FTP errors
Code | Description |
---|---|
100 Codes | The requested action is being taken. Expect a reply before proceeding with a new command. |
110 | Restart marker reply. |
120 | Service ready in (n) minutes. |
125 | Data connection already open, transfer starting. |
150 | File status okay, about to open data connection. |
200 Codes | The requested action has been successfully completed. |
200 | Command okay. |
202 | Command not implemented. |
211 | System status, or system help reply. |
212 | Directory status. |
213 | File status. |
214 | Help message. |
215 | NAME system type. (NAME is an official system name from the list in the Assigned Numbers document.) |
220 | Service ready for new user. |
221 | Service closing control connection. (Logged out if appropriate.) |
225 | Data connection open, no transfer in progress. |
226 | Closing data connection. Requested file action successful (file transfer, abort, etc.). |
227 | Entering Passive Mode. |
230 | User logged in, proceed. |
250 | Requested file action okay, completed. |
257 | "PATHNAME" created. |
300 Codes | The command has been accepted, but the requested action is being held pending receipt of further information. |
331 | User name okay, need password. |
332 | Need account for login. |
350 | Requested file action pending further information. |
400 Codes | The command was not accepted and the requested action did not take place. The error condition is temporary, however, and the action may be requested again. |
421 | Service not available, closing control connection. (May be a reply to any command if the service knows it must shut down.) |
425 | Can't open data connection. |
426 | Connection closed, transfer aborted. |
450 | Requested file action not taken. File unavailable (e.g., file busy). |
451 | Requested action aborted, local error in processing. |
452 | Requested action not taken. Insufficient storage space in system. |
500 Codes | The command was not accepted and the requested action did not take place. |
500 | Syntax error, command unrecognized. This may include errors such as command line too long. |
501 | Syntax error in parameters or arguments. |
502 | Command not implemented. |
503 | Bad sequence of commands. |
504 | Command not implemented for that parameter. |
530 | User not logged in. |
532 | Need account for storing files. |
550 | Requested action not taken. File unavailable (e.g., file not found, no access). |
552 | Requested file action aborted, storage allocation exceeded. |
553 | Requested action not taken. Illegal file name. |
http://www.theegglestongroup.com/writing/ftp_error_codes.php
Monday, November 12, 2012
Disk usage listing by directory
When it is hard to get a per-directory disk-usage listing from the DOS prompt, you can use the following script and command:
diskusage.bat [dir] /L > ficOutput.txt
=============================================
@ECHO OFF
CLS
:: Use local variables
IF "%OS%"=="Windows_NT" SETLOCAL
:: Check command line arguments and Windows version
ECHO.%1 | FIND "/" >NUL
IF NOT ERRORLEVEL 1 IF /I NOT "%~1"=="/L" GOTO Syntax
ECHO.%1 | FIND "?" >NUL
IF NOT ERRORLEVEL 1 GOTO Syntax
ECHO.%1 | FIND "*" >NUL
IF NOT ERRORLEVEL 1 GOTO Syntax
IF NOT "%OS%"=="Windows_NT" GOTO Syntax
IF NOT "%~1"=="" IF /I NOT "%~1"=="/L" IF NOT EXIST "%~1" GOTO Syntax
SET LongFormat=1
IF /I NOT "%~1"=="/L" IF /I NOT "%~2"=="/L" SET LongFormat=0
:: Go to start directory
SET StartDir=%CD%
IF NOT "%~1"=="" IF /I NOT "%~1"=="/L" SET StartDir=%~1
PUSHD "%StartDir%"
IF ERRORLEVEL 1 GOTO Syntax
:: Display header
ECHO Directory Space used (MB)
ECHO.========= ===============
:: Display disk usage for start directory
IF NOT EXIST *.* GOTO SubDirs
FOR /F "tokens=3,4*" %%A IN ('DIR %1 /A-D /-C ^| FIND /I "File(s)"') DO SET ListDir=%%A
:: Different procedures depending on /L switch
IF "%LongFormat%"=="1" GOTO LongFormat
SET /A ListDir=%ListDir%+524288
SET /A ListDir=%ListDir%/1048576
ECHO..\ %ListDir%
SET ListDir=
GOTO SubDirs
:LongFormat
:: Strip last 6 digits from value
SET ListDir=%ListDir:~0,-6%
IF NOT DEFINED ListDir SET ListDir=0
:: Display resulting value in MB
ECHO..\ %ListDir%
:: Clear variable
SET ListDir=
:: Display disk usage for every subdirectory
:SubDirs
FOR /D %%A IN (*.*) DO CALL :List%LongFormat% "%%~A"
:: Done
POPD
GOTO End
:List0
:: Set variable value to bytes used by directory
FOR /F "tokens=3,4*" %%B IN ('DIR /A /-C /S "%~1" ^| FIND /I "File(s)"') DO SET ListDir=%%~B
:: Add 0.5MB in order to properly round the value when integer divided by 1MB
SET /A ListDir=%ListDir%+524288
:: Integer divide by 1MB
SET /A ListDir=%ListDir%/1048576
:: Display resulting value in MB
ECHO.%~1 %ListDir%
:: Clear variable
SET ListDir=
GOTO:EOF
:List1
:: Set variable value to bytes used by directory
FOR /F "tokens=3,4*" %%B IN ('DIR /A /-C /S "%~1" ^| FIND /I "File(s)"') DO SET ListDir=%%~B
:: Strip last 6 digits from value
SET ListDir=%ListDir:~0,-6%
IF NOT DEFINED ListDir SET ListDir=0
:: Display resulting value in MB
ECHO.%~1 %ListDir%
:: Clear variable
SET ListDir=
GOTO:EOF
:Syntax
ECHO.
ECHO DiskUse, Version 5.01 for Windows 2000 / XP
ECHO Display disk space used by subdirectories (tab delimited)
ECHO.
ECHO Usage: DISKUSE [ startdir ] [ /L ]
ECHO.
ECHO Where: "startdir" is the directory containing subdirectories to be
ECHO inventoried (default is the current directory)
ECHO "/L" is used for large numbers, over 2GB, to prevent return
ECHO of negative numbers due to batch math limitations
ECHO (integer division by 1000000 instead of properly
ECHO rounded mathematical division by 1048576)
ECHO.
ECHO Written by Rob van der Woude
ECHO http://www.robvanderwoude.com
:End
IF "%OS%"=="Windows_NT" ENDLOCAL
=============================================
Result:
Directory Space used (MB)
========= ===============
.\ 0
Dir1 8893
Dir2 29
Dir3 29
Dir4 1334
Dir5 1333
Dir6 1463
Dir7 1326
Dir8 1546
Original link:
http://www.tomshardware.co.uk/forum/117561-35-folder-size-command
Original script:
http://www.robvanderwoude.com/files/diskuse_2k.txt
Friday, October 12, 2012
Runtime search of the command history in Linux
In case you don't know: at the command line, press
CTRL+R and the prompt below appears. Start typing and you can
navigate the command history, matching the most recent command that
contains what you typed; just press Enter once it shows the command you want.
~]$
(reverse-i-search)`cd lo': cd log/
Tuesday, October 09, 2012
WGET command
If you are behind a proxy, you will need to start with this command:
$ export http_proxy=http://100.1.1.1:8080
And then you just need to call the site
$ wget -O- http://www.google.pt
The result should look like this, returning the whole HTML page:
--12:37:20-- http://www.google.pt/
Connecting to 100.1.1.1:8080... connected.
Proxy request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: `STDOUT'
[<=> ] 0 --.-K/s <html itemscope="itemscope" itemtype="http://schema.org/WebPage"><head><title>Google</title><script>window.google={kEI:"cAx0UKm4DvCX0QWshYGYBA",getEI:function(a){var b;while(a&&!(a.getAttribute&&(b=a.getAttribute("eid"))))a=a.parentNode;return b||google.kEI},https:function(){return window.location.protocol=="https:"},kEXPI:"17259,23628,32690,35704,39523,39978,4000116,4000354,4000553,4000624,4000648,4000743,4000833,4000955,4001001,4001013,4001064,4001132,4001145,4001188,4001192,4001267,4001293,4001441,4001449,4001461",kCSI:{e:"17259,23628,32690,35704,39523,39978,4000116,4000354,4000553,4000624,4000648,4000743,4000833,4000955,4001001,4001013,4001064,4001132,4001145,4001188,4001192,4001267,4001293,4001441,4001449,4001461",ei:"cAx0UKm4DvCX0QWshYGYBA"},authuser:0,
ml:function(){},kHL:"pt-PT",time:function(){return(new Date).getTime()},log:function(a,b,c,e){var d=new Image,h=google,i=h.lc,f=h.li,j="";d.onerror=(d.onload=(d.onabort=function(){delete i[f]}));i<
...
(google.stt!==undefined)google.kCSI.stt=google.stt;google.csiReport&&google.csiReport()}if(window.addEventListener)window.addEventListener("load",
l,false);else if(window.attachEvent)window.attachEvent("onload",l);google.timers.load.t.prt=(f=(new Date).getTime());
})();
[ <=> ] 11,527 --.-K/s in 0.001s
12:37:20 (20.6 MB/s) - `-' saved [11527]
</script></body></html>
Monday, October 08, 2012
How to Add Background Music to Your Web Page
The one I prefer
For Opera and all IE versions, the BGSOUND tag works; for Netscape, the EMBED tag.
As you probably have noticed, IE 3.0 and above support both methods, so you cannot simply put both those tags into your web document in the hope of supporting all browsers. It will work on Netscape and early versions of IE, but the newer versions of IE will recognize both tags, leading to problems when IE tries to load the music file twice.
The workaround that I've seen on some sites, and that seems to work for me, is to enclose the BGSOUND tag inside NOEMBED tags, thus preventing IE from interpreting the second tag.
This code appears to be compatible with all versions of IE, Netscape and Opera.
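A sketch of the markup described above; the attribute values and the music.mid file name are assumptions, since the post describes the tags rather than showing a complete sample:

```html
<!-- EMBED is picked up by Netscape/Opera; the NOEMBED wrapper keeps newer
     IE versions from also interpreting BGSOUND and loading the file twice. -->
<embed src="music.mid" autostart="true" loop="true" hidden="true">
<noembed>
  <bgsound src="music.mid" loop="infinite">
</noembed>
```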
Tuesday, October 02, 2012
FINDSTR to get smaller log files
When your logs are getting very large and you want to search through them
on Windows machines, you can and should use
the command below:
findstr /s /i /C:"[text to search]" "*.*" > "[output file for the results]"
Sunday, September 30, 2012
Where to download Microsoft Visual Studio 2008 Express Edition
The Microsoft Visual Studio 2008 Express Editions are a free set of tools that are simple, fun and easy to learn. Continuing the Microsoft tradition of enabling developers of all skill levels, this latest release enables hobbyists, students, experienced and casual developers alike to create cool, fun applications.
http://www.microsoft.com/en-us/download/details.aspx?id=6506
Installing IIS 7 on Windows Vista and Windows 7
Complete article at:
http://www.iis.net/learn/install/installing-iis-7/installing-iis-on-windows-vista-and-windows-7
Introduction
You can use the Microsoft® Web Platform Installer (Web PI) to easily install Internet Information Services (IIS), and applications that run on IIS. To learn more about the Web PI, see Learn about and install the Web PI. If you choose to install IIS 7.0 or above manually, you can use this article for guidance.
Before You Begin
Ensure that you have installed one of the editions of Windows Vista or Windows 7 on which IIS 7 and above is supported before you proceed. Not all IIS features are supported on all editions of Windows Vista and Windows 7. Home Basic and Starter editions include only limited basic features of IIS. To see a list of which features are supported on the edition of Windows you are using, see one of the following:
- Available Role Services in IIS 7.0 (Windows Vista)
- Available Web Server (IIS) Role Services in IIS 7.5 (Windows 7)
Tuesday, September 25, 2012
How Application Pools Work (IIS 6.0)
When you run IIS 6.0 in worker process isolation mode, you can separate
different Web applications and Web sites into groups known as application
pools. An application pool is a group of one or more URLs that are
served by a worker process or set of worker processes. Any Web directory or
virtual directory can be assigned to an application pool.
Every application within an application pool shares the same worker process. Because each worker process operates as a separate instance of the worker process executable, W3wp.exe, the worker process that services one application pool is separated from the worker process that services another. Each separate worker process provides a process boundary so that when an application is assigned to one application pool, problems in other application pools do not affect the application. This ensures that if a worker process fails, it does not affect the applications running in other application pools.
Use multiple application pools when you want to help ensure that applications and Web sites are confidential and secure. For example, an enterprise organization might place its human resources Web site and its finance Web site on the same server, but in different application pools. Likewise, an ISP that hosts Web sites and applications for competing companies might run each company's Web services on the same server, but in different application pools. Using different application pools to isolate applications helps prevent one customer from accessing, changing, or using confidential information from another customer's site.
In HTTP.sys, an application pool is represented by a request queue, from which the user-mode worker processes that service an application pool collect the requests. Each pool can manage requests for one or more unique Web applications, which you assign to the application pool based on their URLs. Application pools, then, are essentially worker process configurations that service groups of namespaces.
Multiple application pools can operate at the same time. An application, as defined by its URL, can only be served by one application pool at any time. While one application pool is servicing a request, you cannot route the request to another application pool. However, you can assign applications to another application pool while the server is running.
by http://www.microsoft.com/technet/prodtechnol/WindowsServer2003/Library/IIS/67e39bd8-317e-4cf6-b675-6431d4425248.mspx?mfr=true
What is Application Pool in IIS ?
Before giving the definition, note that the concept of an application pool has existed since IIS 6.0.
Application pools are used to separate sets of IIS worker processes that share the same configuration and application boundaries. They isolate web applications for better security, reliability, availability, and performance, and keep them running without impacting each other. The worker process serves as the process boundary that separates each application pool, so that when one worker process or application has an issue or recycles, other applications or worker processes are not affected.
One application pool can also have multiple worker processes.
Main points to remember:
1. Isolation of different web applications
2. An individual worker process per web application
3. More reliable web applications
4. Better performance
by http://www.dotnetfunda.com/interview/exam1414-what-is-application-pool-in-iis.aspx
Monday, September 17, 2012
Searching files from the end backwards
Whenever I need to search a file with far too many lines, I always have trouble finding what I want.
The problem is that grep requires knowing what to search for, and sometimes you don't know.
To view a file starting from the end:
less +G extractor.log
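A related sketch: when you only need the last few lines rather than an interactive view, tail prints them directly (demo.log below is a throwaway stand-in for the real log):

```shell
# Create a small stand-in for extractor.log.
printf 'first\nsecond\nthird\n' > demo.log

# Print the last 2 lines of the file.
tail -n 2 demo.log

# To keep watching a growing log from the end:
#   tail -f extractor.log
```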
Thursday, September 13, 2012
Searching for files on a machine recursively
To search for files recursively:
find . -name '*.tar'
Friday, August 31, 2012
Getting the used and free space of a drive in Linux
To get the used and free space of a drive in Linux, use the command
df -h
You will get a listing like the following:
Filesystem Size Used Avail Use% Mounted on
/dev/1234 19G 2.4G 16G 14% /
/dev/421 110G 59G 46G 57% /usersdir
/dev/2255 190M 17M 164M 10% /boot
tmpfs 1005M 0 1005M 0% /dev/shm
Disk usage by directory in Linux
To identify the space used by a group of directories, the command is:
du -hs /UsersHomeDir/*
This will return a listing of this kind:
1.9G /UsersHomeDir/User1
117M /UsersHomeDir/User2
3.7G /UsersHomeDir/User3
28M /UsersHomeDir/User4
23M /UsersHomeDir/User5
3.3G /UsersHomeDir/User6
601M /UsersHomeDir/User7
109M /UsersHomeDir/User8
85M /UsersHomeDir/User9
16K /UsersHomeDir/User10
2.7G /UsersHomeDir/User11
412M /UsersHomeDir/User12
21M /UsersHomeDir/User13
22M /UsersHomeDir/User14
45G /UsersHomeDir/User15
1.3G /UsersHomeDir/User16
With this data we can identify which user is exceeding the expected space. There are other options, such as limiting the available space when the user is created, but that can also impact users whose expected growth we cannot guarantee.
Monday, August 27, 2012
Tuesday, July 10, 2012
LINUX: Example script with repetition
process () {
echo "Hi, I'm sleeping for 5 seconds..."
sleep 5
echo "all Done."
echo "repeat again"
process
}
process
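The recursive call above never returns, so every iteration leaves another function call on the stack. A plain loop gives the same repetition without that; a minimal sketch (the 3-iteration counter only keeps the example finite; replace the condition with true for the endless behaviour of the original):

```shell
# Loop-based version of the recursive example above.
count=0
while [ "$count" -lt 3 ]; do
  echo "Hi, I'm sleeping for 1 second..."
  sleep 1
  count=$((count + 1))
done
echo "all Done."
```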
How do I pause for 5 seconds or 2 minutes in my bash shell script?
You need to use the sleep command to add delay for a specified amount of time. The syntax is as follows:
sleep NUMBER[SUFFIX]
Where SUFFIX may be:
1. s for seconds (the default)
2. m for minutes.
3. h for hours.
4. d for days.
To sleep for 5 seconds, use:
sleep 5
To sleep for 2 minutes, use:
sleep 2m
sleep Command Bash Script Example
#!/bin/bash
echo "Hi, I'm sleeping for 5 seconds..."
sleep 5
echo "all Done."
Tuesday, June 05, 2012
CRONTAB definitions
Crontab
Source: Wikipédia, the free encyclopedia.
crontab is a Unix program that edits the file where you specify the commands to run, and the time and day to run them, for cron, a program that executes scheduled commands on Unix-like operating systems (such as Linux or MINIX). cron
checks the time and determines whether there is a program
to run; if there is, it runs it at the requested time
and date.
Parameters:
-l user -- lists the user's scheduled tasks
-e user -- edits the scheduler
-r user -- deletes the user's file
-c directory -- specifies a directory for crontab
Syntax:
Inside the file that opens after the command, the syntax is as follows:
mm hh dd MM ss script
where:
mm = minute (0-59)
hh = hour (0-23)
dd = day (1-31)
MM = month (1-12)
ss = day_of_week (0-6)
script = command to be executed.
Note 1: For day_of_week, 0 is Sunday and 6 is Saturday. The first three letters of the English day name (SUN, MON, TUE, WED, THU, FRI, SAT) also work.
Note 2: In any position you can use * (asterisk) when you don't care about that field.
Note 3: You can use ranges in these fields; the range character is - (hyphen).
Note 4: You can use lists of values in these fields; the list character is , (comma).
Note 5: Any text placed after the program to be executed is considered a comment and is not interpreted by cron.
Some examples:
Every day, on the hour: 00 * * * * /bin/script
Every five minutes, every day (note the division of the 00-59 range by 5): 00-59/5 * * * * /bin/script
At 15 minutes past the hours 10, 12, 16, 18 and 22: 15 10,12,16,18,22 * * * /bin/script
On the first five days of the month at 19:25: 25 19 01-05 * * /bin/script
Monday to Friday at noon and midnight: 00 00,12 * * 1-5 /bin/script
Monday, Wednesday and Friday at 2 AM: 0 2 * * mon,wed,fri /bin/script
Tuesday and Thursday at 3 AM: 0 3 * * tue,thu /bin/script
Every minute: */1 * * * * /bin/script
Friday, May 25, 2012
TAR Multiple Files into a GZIP File
If you want to tar your files together and gzip the resulting tar file: tar czvf cvd.tar.gz cvd*.txt
To untar the gzip'd tar file you would do: tar xzvf cvd.tar.gz -C /path/to/parent/dir
This extracts your files under the /path/to/parent/dir directory.
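A quick round-trip check of the two commands above (the tardemo and extracted directories and the cvd*.txt file names are just placeholders for this sketch):

```shell
# Create two sample files matching the cvd*.txt pattern
mkdir -p tardemo extracted
printf 'one\n' > tardemo/cvd1.txt
printf 'two\n' > tardemo/cvd2.txt

# c = create, z = gzip, f = archive name; -C changes directory first
tar czf cvd.tar.gz -C tardemo cvd1.txt cvd2.txt

# x = extract, -C = directory to extract under
tar xzf cvd.tar.gz -C extracted

ls extracted
```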
Monday, February 27, 2012
UNIX: How to get a group of lines before and after a grep
Unix/Linux Commands
====================
How to get a group of lines before and after a grep
-> grep -A 10 -B 10 [text to search] [file to search]
====================
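As a small sanity check (the app.log file and its contents are hypothetical), -B prints lines of context before each match and -A prints lines after it:

```shell
# Build a hypothetical five-line log file
printf 'line1\nline2\nERROR here\nline4\nline5\n' > app.log

# -B 1 = one line before each match, -A 1 = one line after
grep -A 1 -B 1 'ERROR' app.log
```

This prints line2, ERROR here, and line4.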
Saturday, February 18, 2012
SCOM Training Videos
Integrated monitoring on Windows has never been easy.
Either the vendors didn't build it, or the systems only did logging...
And at first, integrated monitoring was limited to a single type of system.
That changed with CACTI and NAGIOS, and only Microsoft didn't seem very concerned, keeping its MOM within the Windows circle and its surroundings.
Since the vision changed at Microsoft a few years ago, now with SCOM, yes indeed... I've watched quite a few videos... I can't wait to get my hands on it :)
It even monitors databases from various suppliers... Linux, etc... I'll have the opportunity to try it out on an upcoming project... Let's go :)
http://www.microsoft.com/download/en/confirmation.aspx?id=8562
http://www.microsoft.com/en-us/server-cloud/system-center/operations-manager.aspx
http://www.youtube.com/watch?v=5wqI7tbhzxI