aiohttp is an HTTP client/server framework for Python and asyncio. It supports both server-side and client-side WebSockets. Because it works asynchronously, it can handle hundreds of requests per second, giving better performance than comparable synchronous frameworks.
asyncio is a Python library for writing:
- Single-threaded concurrent code using coroutines.
- Multiplexed I/O access over sockets and other resources.
- Network clients and servers, and other related primitives.
This provides concurrency, especially for I/O-bound tasks over sockets and other resources. Concurrency ensures that the user does not have to wait for I/O-bound results.
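As a rough illustration (not from the original article), here is what running two I/O-bound tasks concurrently on a single thread looks like with asyncio; asyncio.sleep stands in for a slow network call:

import asyncio


async def fetch(name, delay):
    # Pretend this is a network request; await hands control back to the event loop
    await asyncio.sleep(delay)
    return '{} done'.format(name)


async def main():
    # Both tasks run concurrently, so the total time is ~1 second, not ~2
    results = await asyncio.gather(fetch('a', 1), fetch('b', 1))
    print(results)


asyncio.run(main())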
In this article, we will create a REST API for our application using aiohttp. It is a simple application with a single Note table.
Set up aiohttp
Activate a virtual environment in Python 3 and install aiohttp:
pip install aiohttp
or clone the GitHub repository and install the requirements:
pip install -r requirements.txt
Create Models
We will configure the application to use SQLite as our database in models.py:
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import scoped_session, sessionmaker

# DBSession = scoped_session(sessionmaker(extension=ZopeTransactionExtension()))
DB_URI = 'sqlite:///stuff.db'

Session = sessionmaker(autocommit=False,
                       autoflush=False,
                       bind=create_engine(DB_URI))
session = scoped_session(Session)

Base = declarative_base()
Then we create the Note class for note objects in models.py:
class Note(Base):
    __tablename__ = 'notes'

    id = Column(Integer, primary_key=True)
    title = Column(String(50))
    description = Column(String(50))
    created_at = Column(String(50))
    created_by = Column(String(50))
    priority = Column(Integer)

    def __init__(self, title, description, created_at, created_by, priority):
        self.title = title
        self.description = description
        self.created_at = created_at
        self.created_by = created_by
        self.priority = priority

    @classmethod
    def from_json(cls, data):
        return cls(**data)

    def to_json(self):
        to_serialize = ['id', 'title', 'description', 'created_at', 'created_by', 'priority']
        d = {}
        for attr_name in to_serialize:
            d[attr_name] = getattr(self, attr_name)
        return d
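For the python models.py step in the Running the application section below to actually create the SQLite file, models.py also needs to create the tables when it is run directly. A minimal sketch of that entry point, placed at the bottom of the file (the original repository may do this slightly differently):

if __name__ == '__main__':
    # Create the notes table in stuff.db so the API has something to query
    Base.metadata.create_all(create_engine(DB_URI))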
Resources
We define our API endpoints in the aiohttp_rest.py file:
import inspect
import json
from collections import OrderedDict

from aiohttp.web import Request, Response, UrlDispatcher, HTTPMethodNotAllowed, HTTPBadRequest

from models import Note, session

DEFAULT_METHODS = ('GET', 'POST', 'PUT', 'DELETE')


class RestEndpoint:

    def __init__(self):
        self.methods = {}

        for method_name in DEFAULT_METHODS:
            method = getattr(self, method_name.lower(), None)
            if method:
                self.register_method(method_name, method)

    def register_method(self, method_name, method):
        self.methods[method_name.upper()] = method

    async def dispatch(self, request: Request):
        method = self.methods.get(request.method.upper())
        if not method:
            raise HTTPMethodNotAllowed('', DEFAULT_METHODS)

        wanted_args = list(inspect.signature(method).parameters.keys())
        available_args = request.match_info.copy()
        available_args.update({'request': request})

        unsatisfied_args = set(wanted_args) - set(available_args.keys())
        if unsatisfied_args:
            # Expected match info that doesn't exist
            raise HTTPBadRequest()

        return await method(**{arg_name: available_args[arg_name] for arg_name in wanted_args})


class CollectionEndpoint(RestEndpoint):

    def __init__(self, resource):
        super().__init__()
        self.resource = resource

    async def get(self) -> Response:
        notes = session.query(Note).all()
        return Response(status=200, body=self.resource.encode({
            'notes': [
                {
                    'id': note.id, 'title': note.title, 'description': note.description,
                    'created_at': note.created_at, 'created_by': note.created_by,
                    'priority': note.priority
                }
                for note in notes
            ]
        }), content_type='application/json')

    async def post(self, request):
        data = await request.json()
        note = Note(title=data['title'], description=data['description'],
                    created_at=data['created_at'], created_by=data['created_by'],
                    priority=data['priority'])
        session.add(note)
        session.commit()

        return Response(status=201, body=self.resource.encode({
            'notes': [
                {
                    'id': note.id, 'title': note.title, 'description': note.description,
                    'created_at': note.created_at, 'created_by': note.created_by,
                    'priority': note.priority
                }
                for note in session.query(Note)
            ]
        }), content_type='application/json')


class InstanceEndpoint(RestEndpoint):

    def __init__(self, resource):
        super().__init__()
        self.resource = resource

    async def get(self, instance_id):
        instance = session.query(Note).filter(Note.id == instance_id).first()
        if not instance:
            return Response(status=404, body=json.dumps({'not found': 404}),
                            content_type='application/json')
        data = self.resource.render_and_encode(instance)
        return Response(status=200, body=data, content_type='application/json')

    async def put(self, request, instance_id):
        data = await request.json()

        note = session.query(Note).filter(Note.id == instance_id).first()
        note.title = data['title']
        note.description = data['description']
        note.created_at = data['created_at']
        note.created_by = data['created_by']
        note.priority = data['priority']
        session.add(note)
        session.commit()

        return Response(status=201, body=self.resource.render_and_encode(note),
                        content_type='application/json')

    async def delete(self, instance_id):
        note = session.query(Note).filter(Note.id == instance_id).first()
        if not note:
            # Nothing to delete
            return Response(status=404,
                            body=json.dumps({'not found': "Note {} doesn't exist".format(instance_id)}),
                            content_type='application/json')
        session.delete(note)
        session.commit()
        return Response(status=204)


class RestResource:

    def __init__(self, notes, factory, collection, properties, id_field):
        self.notes = notes
        self.factory = factory
        self.collection = collection
        self.properties = properties
        self.id_field = id_field

        self.collection_endpoint = CollectionEndpoint(self)
        self.instance_endpoint = InstanceEndpoint(self)

    def register(self, router: UrlDispatcher):
        router.add_route('*', '/{notes}'.format(notes=self.notes),
                         self.collection_endpoint.dispatch)
        router.add_route('*', '/{notes}/{{instance_id}}'.format(notes=self.notes),
                         self.instance_endpoint.dispatch)

    def render(self, instance):
        return OrderedDict((notes, getattr(instance, notes)) for notes in self.properties)

    @staticmethod
    def encode(data):
        return json.dumps(data, indent=4).encode('utf-8')

    def render_and_encode(self, instance):
        return self.encode(self.render(instance))
By defining all the handler methods (get, post, put and delete) with the async keyword, we ensure that those operations are performed asynchronously and that a response is returned from both the collection endpoint and the instance endpoint. After setting up our endpoints, we declare the resources in the aio-app.py file:
from aiohttp.web import Application, run_app

from aiohttp_rest import RestResource
from models import Note

notes = {}
app = Application()
person_resource = RestResource(
    'notes', Note, notes,
    ('title', 'description', 'created_at', 'created_by', 'priority'),
    'title')
person_resource.register(app.router)

if __name__ == '__main__':
    run_app(app)
Running the application
First, create the database:
python models.py
Run the app by executing the following in a terminal:
python aio-app.py
Open a Python shell and execute some requests:
import json
import requests

requests.post('http://localhost:8080/notes',
              data=json.dumps({"title": "note two", "created_at": "2017-08-23 00:00",
                               "created_by": "apcelent",
                               "description": "sample notes", "priority": 4}))

requests.put('http://localhost:8080/notes/1',
             data=json.dumps({"title": "note edit", "created_at": "2017-08-23 00:00",
                              "created_by": "apcelent",
                              "description": "sample notes edit", "priority": 4}))

requests.delete('http://localhost:8080/notes/1')
These calls create, update and delete notes in the database through the aiohttp REST API. The notes can be viewed at http://127.0.0.1:8080/notes.
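To check the result from the same shell, the collection endpoint can simply be fetched with requests as well (a small illustrative addition, not part of the original article):

import requests

# Fetch the collection endpoint and print the JSON body returned by the API
response = requests.get('http://127.0.0.1:8080/notes')
print(response.status_code)
print(response.json())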
The source code can be found here.
The article originally appeared on the Apcelent Tech Blog.
Top comments (3)

Be aware that this API shouldn't really be considered async, because the code as a whole is not asynchronous: the database operations use synchronous methods. With that being said, this can lead to:
- Race conditions and data corruption: when async and sync code are mixed, accessing shared resources can lead to race conditions and data corruption.
- Unexpected behavior and reduced performance: mixing async and sync code without understanding how they interact can lead to unexpected behavior and reduced performance.
- Deadlocks: improper use of async/await can lead to deadlocks.
- Thread pool starvation: offloading synchronous code to the thread pool (for example with run_in_executor) can lead to thread pool starvation.
If you want to create an async REST API, refer to SQLAlchemy's asyncio support.
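As a rough illustration of what that looks like (a sketch only, assuming SQLAlchemy 1.4+ and the aiosqlite driver, neither of which the original article uses), the query itself becomes awaitable and no longer blocks the event loop:

from sqlalchemy import select
from sqlalchemy.ext.asyncio import create_async_engine, AsyncSession

from models import Note  # the Note model defined in the article

# aiosqlite provides an async driver for the same stuff.db database
engine = create_async_engine('sqlite+aiosqlite:///stuff.db')


async def list_notes():
    # Each call gets its own session; the query awaits instead of blocking the loop
    async with AsyncSession(engine) as session:
        result = await session.execute(select(Note))
        return result.scalars().all()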
I think that if this website is one of the first results that comes up in a Google search, you should fix the code instead of presenting this as an actually async REST API.
Also, your DB queries are not safe. Instead, use a context manager to ensure each one runs inside a transaction:
from contextlib import contextmanager as cm

from sqlalchemy.orm import sessionmaker


@cm
def sm():
    # eng is assumed to be your SQLAlchemy engine
    SessionMaker = sessionmaker(bind=eng)
    session = SessionMaker()
    try:
        yield session
        session.commit()
    except Exception:
        print("rollback transaction")
        session.rollback()
        raise
    finally:
        session.close()
Then, for queries, it is simpler to do:
def insertdata(value):
    # The context manager ensures the insert is committed, or rolled back on error
    with sm() as s:
        query = tablename(tablefield1=value)
        s.add(query)


def selectdata(value):
    with sm() as s:
        result = s.query(tablename).filter(tablename.column == value).first()
        return result.columname


def updatedata(value, valuetoupdate):
    with sm() as s:
        row = s.query(tablename).filter(tablename.column == value).first()
        row.columntoupdate = valuetoupdate


def deletedata(valueid):
    with sm() as s:
        s.query(tablename).filter(tablename.columnname == valueid).delete()
This way you ensure that no two queries share the same session.

Hello, how are you @apcelent?
There's an issue with the way you wrote this API.
SQLAlchemy is not asyncio-aware. This means that every query listed above will block the Python thread while it waits on the database, leading to inefficient I/O and effectively scaling to one coroutine per thread. I strongly recommend a review of the Python asyncio model.
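One standard asyncio workaround, if the synchronous SQLAlchemy calls are kept, is to push them onto a thread pool so the event loop stays free; a rough sketch (run_in_executor is not something the article or this comment uses):

import asyncio

from models import Note, session


def _list_notes_blocking():
    # Plain synchronous SQLAlchemy query, unchanged from the article
    return session.query(Note).all()


async def list_notes():
    loop = asyncio.get_running_loop()
    # Run the blocking query in the default thread pool so the loop keeps serving requests
    return await loop.run_in_executor(None, _list_notes_blocking)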
If you want to use SQLAlchemy, an alternative is here: github.com/RazerM/sqlalchemy_aio.
Otherwise, it is better to go back to Flask and use gevent to enable concurrent I/O.