爱美女网 Crawler [Preview] [23.06.01] [Windows] - Happy Children's Day

Like the imn5 crawler in the previous post, this is a preview release and does not support search. It does support downloading images in webp format (a file format I have not seen on any of the other sites).
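The webp support is mostly a matter of saving the response bytes under the extension the site serves. A minimal stdlib sketch of that download step; the helper names, the `images` output directory, and the example URL are my own assumptions, not the crawler's actual code:

```python
import os
from urllib.parse import urlparse
from urllib.request import Request, urlopen

def local_name(url: str, out_dir: str = "images") -> str:
    # Derive a local file name from the URL path, keeping the
    # .webp extension the server uses.
    name = os.path.basename(urlparse(url).path) or "image.webp"
    return os.path.join(out_dir, name)

def download(url: str, out_dir: str = "images") -> str:
    # Fetch the image bytes and write them out unchanged; webp needs
    # no special handling, only viewers that understand the format.
    os.makedirs(out_dir, exist_ok=True)
    path = local_name(url, out_dir)
    req = Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urlopen(req) as resp, open(path, "wb") as f:
        f.write(resp.read())
    return path
```

If a viewer cannot open webp, a post-processing pass with Pillow (`Image.open(path).save(path_with_jpg_ext)`) can convert the saved files.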

"This site carries the newest XiuRen studio photo sets on the web. New works appear first as previews, with full HD versions generally following within two weeks. HD versions are large images 1200 pixels wide, free of other sites' watermarks, and the site shows no ads on either mobile or desktop." (promotional copy from the image site)

Continue Reading

秀人集 Crawler [Updated] [23.05.13] [Windows]

C:\Users\obaby>F:\Pycharm_Projects\meitulu-spider\dist\xiurenji2\xiurenji2.exe
****************************************************************************************************
秀人集 Crawler [Updated]
Version: 23.05.13
Current server address: https://www.xiuren5.vip
Blog: http://oba.by
How do you like my domain above? Anyone who says it's bad doesn't get to use it!! Hmph!!
****************************************************************************************************
USAGE:
spider -h <help> -a <all> -q <search>
Arguments:
         -a <download all site images>
         -q <query the image with keywords>
         -h <display help text, just this>
Option Arguments:
         -p <image download path>
         -r <random index category list>
         -c <single category url>
         -e <early stop, work in site crawl mode only>
         -s <site url eg: http://www.xiurenji.vip (no last backslash "/")>
****************************************************************************************************
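The flag style in the usage text (`-h`, `-a`, `-q <search>` plus optional `-p/-r/-c/-e/-s`) matches `getopt` parsing. A hedged sketch of what that argument handling could look like; the dictionary keys and defaults are my assumptions, not the crawler's internals:

```python
import getopt

def parse_args(argv):
    # Optstring mirrors the usage text: q, p, c, s take values;
    # h, a, r, e are boolean switches.
    opts, _ = getopt.getopt(argv, "haq:p:rc:es:")
    config = {
        "help": False, "all": False, "query": None, "path": ".",
        "random": False, "category": None, "early_stop": False,
        "site": "https://www.xiuren5.vip",
    }
    for flag, value in opts:
        if flag == "-h":
            config["help"] = True
        elif flag == "-a":
            config["all"] = True
        elif flag == "-q":
            config["query"] = value
        elif flag == "-p":
            config["path"] = value
        elif flag == "-r":
            config["random"] = True
        elif flag == "-c":
            config["category"] = value
        elif flag == "-e":
            config["early_stop"] = True
        elif flag == "-s":
            # Usage text warns against a trailing "/", so strip it.
            config["site"] = value.rstrip("/")
    return config
```

Stripping the trailing slash on `-s` up front means later URL joins like `site + "/category"` never produce a double slash.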

Continue Reading

精品美女吧 Crawler [Windows] [23.04.16]

精品美女吧 Crawler
Version: 23.04.16
Blog: http://www.h4ck.org.cn
****************************************************************************************************
USAGE:
spider -h <help> -a <all> -q <search> -e <early stop>
Arguments:
         -a <download all site images>
         -h <display help text, just this>
Option Arguments:
         -p <image download path>
         -r <random index category list>
         -c <single category url>
         -e <early stop, work in site crawl mode only>
         -s <site url eg: https://www.jpxgmn.net (no last backslash "/")>
****************************************************************************************************
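Both crawlers expose `-e <early stop>` for site-crawl mode. The posts do not spell out its semantics, so the following is only my guess at the idea: stop paging as soon as a page contains nothing that has not already been downloaded, on the assumption that older pages hold only old content:

```python
def crawl(pages, seen, early_stop=True):
    # pages: an iterable of pages, each a list of item URLs, newest first.
    # seen: a set of URLs downloaded in earlier runs (mutated in place).
    downloaded = []
    for page in pages:
        new_items = [u for u in page if u not in seen]
        if early_stop and not new_items:
            # Nothing new on this page; assume the rest are older still.
            break
        downloaded.extend(new_items)
        seen.update(new_items)
    return downloaded
```

Without `-e` the loop walks every page, which is what a full `-a` site crawl needs; with it, incremental re-runs finish quickly.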

Continue Reading

requests SSLCertVerificationError

Running the 精品美女吧 crawler today failed with a certificate error:
Traceback (most recent call last):
  File "requests\adapters.py", line 439, in send
  File "urllib3\connectionpool.py", line 785, in urlopen
  File "urllib3\util\retry.py", line 592, in increment
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='www.jpmn8.cc', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1124)')))
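`CERTIFICATE_VERIFY_FAILED: unable to get local issuer certificate` usually means the server does not send its full certificate chain, or the local CA bundle is stale. For a scraper fetching public pages, one blunt workaround is to skip verification entirely (with `requests` that would be `session.verify = False`; note this removes MITM protection). A stdlib sketch of the same idea:

```python
import ssl
from urllib.request import urlopen

# Build a context that skips both hostname checks and chain
# verification. Acceptable for scraping public image pages,
# never for anything carrying credentials.
ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

# Usage (network call, shown commented out):
# html = urlopen("https://www.jpmn8.cc/", context=ctx).read()
```

The cleaner fix is updating the CA bundle (for `requests`, `pip install -U certifi`), which resolves the error whenever the missing issuer is simply absent from an outdated local store.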

Continue Reading