python-requests has a useful params kwarg; you can do
url = 'http://aaa'
params = {
    'expand': 'variations,informationBlocks,customisations',
}
response = requests.request(
    method='GET',
    url=url,
    headers=headers,
    params=params,
)
and it will result in http://aaa?expand=variations,informationBlocks,customisations
What do you think about adding a params kwarg to scrapy.Request()? It would simplify work: there would be no need to urlencode the query string from a dict and concatenate it onto the URL.
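For reference, this is roughly what spiders have to do by hand today: encode the dict and glue it onto the URL before constructing the Request. A minimal sketch using only the standard library (the helper name url_with_params is illustrative, not a scrapy API; safe=',' is an assumption chosen so the output matches the requests-style URL shown above):

```python
from urllib.parse import urlencode

def url_with_params(url, params):
    # urlencode handles percent-escaping; safe=',' keeps commas literal
    # so the result matches http://aaa?expand=variations,... above.
    return url + '?' + urlencode(params, safe=',')

url = 'http://aaa'
params = {'expand': 'variations,informationBlocks,customisations'}
full_url = url_with_params(url, params)
print(full_url)
# the spider would then do: scrapy.Request(full_url, callback=...)
```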
Yes, I'm looking for this attribute too. I used to use params in the requests lib; sometimes it is inconvenient and error-prone to url-encode the space, &, ?, and % characters yourself.
Try using FormRequest in Scrapy instead of Request.
In your case, the code is as follows:
return FormRequest(url=url, method='GET', formdata=params, callback=self.callback_func)
Hi, @pawelmhm
@leonardfrank provides a way to do this, and if you want to know more, please read here.
I would like to offer another solution using a module that comes with Python: urllib.parse — Parse URLs into components (Python 3.6.5 documentation). This module has several URL-related functions:
and many others. They can help you manipulate URLs in many ways. Going through the official documentation will help you use them appropriately.
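A short sketch of a few of those urllib.parse functions applied to this use case (the URLs and values are made up for illustration):

```python
from urllib.parse import urlencode, quote, urljoin, urlparse

# urlencode: dict -> query string, escaping spaces, &, ?, % for you
qs = urlencode({'q': 'a b', 'page': 2})
print(qs)  # q=a+b&page=2

# quote: percent-escape a single path or value component
component = quote('50% off?')
print(component)  # 50%25%20off%3F

# urljoin: resolve a relative link against a base URL
resolved = urljoin('http://aaa/x/y', 'z')
print(resolved)  # http://aaa/x/z

# urlparse: split a URL into scheme/netloc/path/query parts
query = urlparse('http://aaa/x?expand=variations').query
print(query)  # expand=variations
```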