
Batch script to tell whether a website is up or down

My problem is that ping can't tell me whether the website is up or down. I need to know whether the site is up, and if it is down, the script should restart it. If anyone can help me, that would be amazing.

The only way I can think of is to fetch the site's content and check that it looks normal.

I want to run this on a server that runs Apache Tomcat.

@echo off

:first
PING -n 5 google.com | FIND "TTL" >nul
IF %errorlevel% == 0 (
    echo Website is up.
    goto :first
) ELSE (
    echo Website is down. Restarting service.
    goto :second
)

:: Restart loop
:second

:: Stop the service
net stop TapiSrv

:: Wait roughly 10 seconds
ping -n 10 127.0.0.1 >nul

:: Start the service again
net start TapiSrv

:: Go back and check whether the website is up
GOTO :first

Depending on how you define "website is up", you should look at something like 'curl' or 'wget' for this. Also, consider using PowerShell. – Mat 2013-05-12 19:01:59


Look at BITSADMIN - it's a native Windows command for downloading, and it lets you read the site's response. – npocmaka 2013-05-12 20:42:52


I'd rather not add third-party software if I can avoid it. I have a PHP method, but I can't figure out how to make PHP and batch work together. – 2013-05-12 20:49:37

Answers


You could add a servlet or PHP page to the web application that prints a simple piece of text such as "running" or the current time. You can then hit that URL with the help of HttpClient (Java) or a similar tool and check whether the response is what you expect.
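A minimal batch-side sketch of that idea, checking the status page with BITSADMIN (the built-in downloader mentioned in the comments above) rather than Java HttpClient; the URL http://yoursite/status.php and the keyword "running" are assumptions for illustration:

@echo off
:: Sketch only: assumes the app serves http://yoursite/status.php,
:: which prints the word "running" when healthy.
set "STATUSURL=http://yoursite/status.php"
set "TMPFILE=%temp%\status.txt"

del "%TMPFILE%" 2>nul
:: BITSADMIN ships with Windows - no third-party download needed.
bitsadmin /transfer statuscheck /download /priority normal "%STATUSURL%" "%TMPFILE%" >nul

findstr /i "running" "%TMPFILE%" >nul 2>&1
if %errorlevel% == 0 (echo site is up) else (echo site is down)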


Try the wget program from UnixUtils or GnuWin32, for example:

wget --timeout=5 --tries=1 --quiet --spider http://google.com >nul 2>&1 && echo site is up || echo site is down 
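
That one-liner drops straight into the restart loop from the question. A sketch, assuming wget.exe is on the PATH and that TapiSrv is the service to bounce (as in the original script):

@echo off
:check
wget --timeout=5 --tries=1 --quiet --spider http://google.com >nul 2>&1
if %errorlevel% == 0 (
    echo Website is up.
) else (
    echo Website is down. Restarting service.
    net stop TapiSrv
    net start TapiSrv
)
:: Wait roughly 30 seconds before the next check.
ping -n 30 127.0.0.1 >nul
goto :check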

wget options and parameters:

 
GNU Wget 1.8.2, a non-interactive network retriever. 
Usage: wget [OPTION]... [URL]... 

Mandatory arguments to long options are mandatory for short options too. 

Startup: 
    -V, --version   display the version of Wget and exit. 
    -h, --help    print this help. 
    -b, --background  go to background after startup. 
    -e, --execute=COMMAND execute a `.wgetrc'-style command. 

Logging and input file: 
    -o, --output-file=FILE  log messages to FILE. 
    -a, --append-output=FILE append messages to FILE. 
    -d, --debug    print debug output. 
    -q, --quiet    quiet (no output). 
    -v, --verbose    be verbose (this is the default). 
    -nv, --non-verbose   turn off verboseness, without being quiet. 
    -i, --input-file=FILE  download URLs found in FILE. 
    -F, --force-html   treat input file as HTML. 
    -B, --base=URL    prepends URL to relative links in -F -i file. 
     --sslcertfile=FILE  optional client certificate. 
     --sslcertkey=KEYFILE optional keyfile for this certificate. 
     --egd-file=FILE  file name of the EGD socket. 

Download: 
     --bind-address=ADDRESS bind to ADDRESS (hostname or IP) on local host. 
    -t, --tries=NUMBER   set number of retries to NUMBER (0 unlimits). 
    -O --output-document=FILE write documents to FILE. 
    -nc, --no-clobber    don't clobber existing files or use .# suffixes. 
    -c, --continue    resume getting a partially-downloaded file. 
     --progress=TYPE   select progress gauge type. 
    -N, --timestamping   don't re-retrieve files unless newer than local. 
    -S, --server-response  print server response. 
     --spider     don't download anything. 
    -T, --timeout=SECONDS  set the read timeout to SECONDS. 
    -w, --wait=SECONDS   wait SECONDS between retrievals. 
     --waitretry=SECONDS  wait 1...SECONDS between retries of a retrieval. 
     --random-wait   wait from 0...2*WAIT secs between retrievals. 
    -Y, --proxy=on/off   turn proxy on or off. 
    -Q, --quota=NUMBER   set retrieval quota to NUMBER. 
     --limit-rate=RATE  limit download rate to RATE. 

Directories: 
    -nd --no-directories   don't create directories. 
    -x, --force-directories   force creation of directories. 
    -nH, --no-host-directories  don't create host directories. 
    -P, --directory-prefix=PREFIX save files to PREFIX/... 
     --cut-dirs=NUMBER   ignore NUMBER remote directory components. 

HTTP options: 
     --http-user=USER  set http user to USER. 
     --http-passwd=PASS set http password to PASS. 
    -C, --cache=on/off  (dis)allow server-cached data (normally allowed). 
    -E, --html-extension  save all text/html documents with .html extension. 
     --ignore-length  ignore `Content-Length' header field. 
     --header=STRING  insert STRING among the headers. 
     --proxy-user=USER  set USER as proxy username. 
     --proxy-passwd=PASS set PASS as proxy password. 
     --referer=URL   include `Referer: URL' header in HTTP request. 
    -s, --save-headers  save the HTTP headers to file. 
    -U, --user-agent=AGENT identify as AGENT instead of Wget/VERSION. 
     --no-http-keep-alive disable HTTP keep-alive (persistent connections). 
     --cookies=off   don't use cookies. 
     --load-cookies=FILE load cookies from FILE before session. 
     --save-cookies=FILE save cookies to FILE after session. 

FTP options: 
    -nr, --dont-remove-listing don't remove `.listing' files. 
    -g, --glob=on/off   turn file name globbing on or off. 
     --passive-ftp   use the "passive" transfer mode. 
     --retr-symlinks   when recursing, get linked-to files (not dirs). 

Recursive retrieval: 
    -r, --recursive   recursive web-suck -- use with care! 
    -l, --level=NUMBER  maximum recursion depth (inf or 0 for infinite). 
     --delete-after  delete files locally after downloading them. 
    -k, --convert-links  convert non-relative links to relative. 
    -K, --backup-converted before converting file X, back up as X.orig. 
    -m, --mirror    shortcut option equivalent to -r -N -l inf -nr. 
    -p, --page-requisites get all images, etc. needed to display HTML page. 

Recursive accept/reject: 
    -A, --accept=LIST    comma-separated list of accepted extensions. 
    -R, --reject=LIST    comma-separated list of rejected extensions. 
    -D, --domains=LIST    comma-separated list of accepted domains. 
     --exclude-domains=LIST  comma-separated list of rejected domains. 
     --follow-ftp     follow FTP links from HTML documents. 
     --follow-tags=LIST   comma-separated list of followed HTML tags. 
    -G, --ignore-tags=LIST   comma-separated list of ignored HTML tags. 
    -H, --span-hosts     go to foreign hosts when recursive. 
    -L, --relative     follow relative links only. 
    -I, --include-directories=LIST list of allowed directories. 
    -X, --exclude-directories=LIST list of excluded directories. 
    -np, --no-parent     don't ascend to the parent directory. 

I threw this together - it works here on Windows 8.

Note that it only tells you whether the site replied with a message - it doesn't check whether the page served is a normal operational page or an error message.

@echo off
if "%~1"=="" (
    echo %0 www.url.com
    echo Checks the status of the URL
    pause
    goto :EOF
)

:: Build a small VBScript that fetches the URL and saves the response body.
>"%temp%\geturl.vbs" echo Set objArgs = WScript.Arguments
>>"%temp%\geturl.vbs" echo url = objArgs(0)
>>"%temp%\geturl.vbs" echo pix = objArgs(1)
>>"%temp%\geturl.vbs" echo With CreateObject("MSXML2.XMLHTTP")
>>"%temp%\geturl.vbs" echo .open "GET", url, False
>>"%temp%\geturl.vbs" echo .send
>>"%temp%\geturl.vbs" echo a = .ResponseBody
>>"%temp%\geturl.vbs" echo End With
>>"%temp%\geturl.vbs" echo With CreateObject("ADODB.Stream")
>>"%temp%\geturl.vbs" echo .Type = 1 'adTypeBinary
>>"%temp%\geturl.vbs" echo .Mode = 3 'adModeReadWrite
>>"%temp%\geturl.vbs" echo .Open
>>"%temp%\geturl.vbs" echo .Write a
>>"%temp%\geturl.vbs" echo .SaveToFile pix, 2 'adSaveCreateOverwrite
>>"%temp%\geturl.vbs" echo .Close
>>"%temp%\geturl.vbs" echo End With

:: Run the script; if the request fails, no url.htm is created.
cscript /nologo "%temp%\geturl.vbs" http://%1 url.htm 2>nul
if not exist url.htm (
    echo site is down or access is denied
) else (
    for %%a in (url.htm) do if %%~za GTR 0 echo site is up
    del url.htm
)
del "%temp%\geturl.vbs"
pause
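
Saved under a name of your choice (checkurl.bat here is just an example), it takes the host as its first argument:

checkurl.bat www.google.com

Note that an empty response is not reported as up: the for loop only prints "site is up" when the saved url.htm is larger than zero bytes.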

Where is geturl.vbs in this script? – 2013-05-14 03:35:27


It's created by the batch file - the section between the blank lines. – foxidrive 2013-05-14 15:24:45
