
I ran the Nikto tool to check a PC, but I cannot open a file through the Squid proxy.

- Nikto v2.1.6 
--------------------------------------------------------------------------- 
+ Target IP:   10.xx.xx.xx 
+ Target Hostname: 10.xx.xx.xx 
+ Target Port:  8028 
+ Start Time:   2017-04-25 04:46:05 (GMT-4) 
--------------------------------------------------------------------------- 
+ Server: squid/4.0.17 
+ Retrieved via header: 1.1 localhost.localdomain (squid/4.0.17) 
+ The anti-clickjacking X-Frame-Options header is not present. 
+ The X-XSS-Protection header is not defined. This header can hint to the user agent to protect against some forms of XSS 
+ Uncommon header 'x-cache-lookup' found, with contents: NONE from localhost.localdomain:8028 
+ Uncommon header 'x-cache' found, with contents: MISS from localhost.localdomain 
+ Uncommon header 'x-squid-error' found, with contents: ERR_INVALID_URL 0 
+ The X-Content-Type-Options header is not set. This could allow the user agent to render the content of the site in a different fashion to the MIME type 
+ No CGI Directories found (use '-C all' to force check all possible dirs) 
+ Entry '<li><p>Illegal character in hostname; underscores are not allowed.</p></li>' in robots.txt returned a non-forbidden or redirect HTTP code (400) 
+ "robots.txt" contains 1 entry which should be manually viewed. 

Nikto found Squid 4.0.17 on port 8028 and reported a robots.txt file.

But if I try to open it in a browser at http://10.xx.xx.xx:8028/robots.txt, I get an error:

ERROR 

The requested URL could not be retrieved 

How can I see this file?

Answer


You can use Nikto's proxy option, for example nikto -h http://example.com/ -useproxy 10.xx.xx.xx:8028. That way the scan goes through the proxy that is in use, and it will find the robots.txt. You can also set the proxy address in your browser and then open the file from the URL bar. In Firefox, set the proxy under Preferences -> Advanced -> Network -> Settings -> Manual proxy configuration.
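
As a minimal sketch of both approaches, assuming the proxy is the Squid instance found on 10.xx.xx.xx:8028 in the scan above (example.com is only a placeholder target, and the curl command is an extra illustration not part of the original answer):

    # Scan a target site through the Squid proxy with Nikto
    nikto -h http://example.com/ -useproxy http://10.xx.xx.xx:8028/

    # Or fetch a site's robots.txt through the same proxy with curl, without a full scan
    curl --proxy http://10.xx.xx.xx:8028 http://example.com/robots.txt

Requesting http://10.xx.xx.xx:8028/robots.txt directly fails because Squid receives it as an origin-server request with no full URL to forward, which is consistent with the x-squid-error: ERR_INVALID_URL header in the scan output; sending requests through it as a proxy avoids that.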