How to specify parameters on a Request using scrapy

How do I pass parameters in a request to a URL like this:

site.com/search/?action=search&description=My Search here&e_author=

How do I put the arguments in the structure of a Spider Request, something like this example:

req = Request(url="site.com/",parameters={x=1,y=2,z=3})


ANSWERS:


Pass your GET parameters inside the URL itself:

return Request(url="https://yoursite.com/search/?action=search&description=MySearchhere&e_author=")

You should probably define your parameters in a dictionary and then "urlencode" it:

from urllib.parse import urlencode  # Python 2: from urllib import urlencode

params = {
    "action": "search",
    "description": "My search here",
    "e_author": ""
}
url = "https://yoursite.com/search/?" + urlencode(params)

return Request(url=url)
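For a quick sanity check of what urlencode produces for a dictionary like the one above (Python 3 shown; the values mirror the snippet, so treat this as a sketch): spaces become "+", and the empty e_author value is kept as an empty parameter.

```python
from urllib.parse import urlencode

params = {
    "action": "search",
    "description": "My search here",
    "e_author": ""
}
query = urlencode(params)
print(query)  # action=search&description=My+search+here&e_author=
```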

Scrapy doesn't offer this directly. What you are trying to do is build a URL, for which you can use the urllib.parse module (urlparse in Python 2).
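A sketch of that approach (Python 3, where the relevant functions live in urllib.parse; the host and path here are placeholders from the question):

```python
from urllib.parse import urlencode, urlunsplit

# Build the query string from a dict, then assemble the full URL
# from its parts: (scheme, netloc, path, query, fragment).
params = {"action": "search", "description": "My search here", "e_author": ""}
query = urlencode(params)
url = urlunsplit(("https", "yoursite.com", "/search/", query, ""))
print(url)  # https://yoursite.com/search/?action=search&description=My+search+here&e_author=
```

Also worth noting: Scrapy's FormRequest with method="GET" can serialize a formdata dict into the URL's query string for you.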



 MORE:


 ? Broad Crawling with scrapy
 ? How to use Rules in Scrapy for following some links?
 ? Error with links in scrapy
 ? Crawling domains serially with Scrapy
 ? How to get scrapy FormRequest work
 ? How to get the scrapy form submission working
 ? How to compare each Scrapy spider item with another Scrapy spider items?