Error using random_proxies #2

Open
CopsLikeDonuts opened this issue Jul 10, 2020 · 2 comments

@CopsLikeDonuts

Hello, I was just testing your package and got an error. Could you look through it and fix it?

Installed packages:
beautifulsoup4==4.9.1
bs4==0.0.1
certifi==2020.6.20
chardet==3.0.4
idna==2.10
requests==2.24.0
soupsieve==2.0.1
urllib3==1.25.9
random_proxies, of course

My code:
import requests
from bs4 import BeautifulSoup
import os
from time import sleep
import csv
from fake_useragent import UserAgent
import lxml
from random import choice, uniform
from random_proxies import random_proxy

URL = 'http://sitespy.ru/my-ip'

proxy = random_proxy()
random_proxy()

@g0per
Contributor

g0per commented Jul 10, 2020

The cache server returns an error (success: "no"), so I've been bypassing it with the use_cache=False argument. Not an ideal solution, though.

I edited your code to show it:

import requests
from random_proxies import random_proxy

theurl = 'http://sitespy.ru/my-ip'
# Bypass the broken cache server
proxy = random_proxy(use_cache=False)
print(proxy)
# Route the request through the proxy via the proxies mapping
proxies = {'http': proxy}
petition = requests.get(theurl, proxies=proxies)
print(petition.status_code)

Which returned last time I ran it:

161.35.4.201:80
200

@2knal
Owner

2knal commented Jul 11, 2020

Thanks for reporting the issue 😃!
Most likely the cache server is down, which is causing the problem.
I will look into it.
Apologies for any inconvenience caused.
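
Until the cache server is back up, a possible guard on the caller's side is to fall back to use_cache=False when the cached lookup fails. A minimal sketch, assuming the cache failure surfaces as an exception from random_proxy() (a guess based on the report above, not confirmed library behaviour):

from random_proxies import random_proxy

def get_proxy():
    # Try the cached proxy list first; if that fails (as the report above
    # suggests it currently does), bypass the cache server entirely.
    # Assumption: the failure is raised as an exception, not returned as None.
    try:
        return random_proxy()
    except Exception:
        return random_proxy(use_cache=False)

proxy = get_proxy()
print(proxy)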
