How can I track tweets from a particular city and store them in MongoDB via Python?
I want to write a piece of code that grabs tweets from a particular location (e.g. a city) and puts them in MongoDB via Python. I am a complete newcomer to programming, but I have managed to track particular hashtags on Twitter and store them in MongoDB using the following piece of code:
import pycurl, json
from pymongo import Connection

stream_url = "https://stream.twitter.com/1/statuses/filter.json"
words = "track=#occupywallstreet"
user = "myuser"
password = "mypass"  # "pass" is a reserved word in Python, so use another name

def on_tweet(data):
    try:
        tweet = json.loads(data)
        db.posts.insert(tweet)
        print tweet
    except:
        return

connection = Connection()
db = connection.occupywallstreet

conn = pycurl.Curl()
conn.setopt(pycurl.POST, 1)
conn.setopt(pycurl.POSTFIELDS, words)
conn.setopt(pycurl.HTTPHEADER, ["Connection: keep-alive", "Keep-Alive: 3000"])
conn.setopt(pycurl.USERPWD, "%s:%s" % (user, password))
conn.setopt(pycurl.URL, stream_url)
conn.setopt(pycurl.WRITEFUNCTION, on_tweet)
conn.perform()
How can I track geolocated tweets, i.e. tweets from a particular city? Is there a way I can alter the above code to suit my needs?
Thanks!
You should use the locations parameter in this case. It takes a bounding box as comma-separated longitude,latitude pairs, with the southwest corner first and the northeast corner second:
import pycurl
import json
from pymongo import Connection

stream_url = "https://stream.twitter.com/1/statuses/filter.json"
locations = "locations=-74,40,-73,41"  # bounding box roughly covering New York
user = "myuser"
password = "mypass"

def on_tweet(data):
    try:
        tweet = json.loads(data)
        db.posts.insert(tweet)
        print tweet
    except:
        return

connection = Connection()
db = connection.occupywallstreet

conn = pycurl.Curl()
conn.setopt(pycurl.POST, 1)
conn.setopt(pycurl.POSTFIELDS, locations)
conn.setopt(pycurl.HTTPHEADER, ["Connection: keep-alive", "Keep-Alive: 3000"])
conn.setopt(pycurl.USERPWD, "%s:%s" % (user, password))
conn.setopt(pycurl.URL, stream_url)
conn.setopt(pycurl.WRITEFUNCTION, on_tweet)
conn.perform()
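Keep in mind that the streaming filter can also return tweets whose associated "place" merely overlaps the bounding box, not only tweets with exact coordinates inside it. If you want to keep only tweets with coordinates inside the box, a minimal client-side check is sketched below; the constants and the in_bounding_box helper are my own names, and it assumes the tweet JSON carries a GeoJSON "coordinates" field of the form [longitude, latitude].

# Sketch of a client-side filter. Tweets matched only via their "place"
# bounding box usually have no "coordinates" field and are skipped here.
SW_LON, SW_LAT, NE_LON, NE_LAT = -74, 40, -73, 41  # same box as the locations string

def in_bounding_box(tweet):
    geo = tweet.get("coordinates")
    if not geo:
        return False
    lon, lat = geo["coordinates"]
    return SW_LON <= lon <= NE_LON and SW_LAT <= lat <= NE_LAT

With that in place, on_tweet could call in_bounding_box(tweet) and only insert the tweet into db.posts when it returns True.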
Hope this helps.