# coslynx/OpenAI-API-Python-Client

AI-powered client for the OpenAI API to build NLP applications. Created at https://coslynx.com
- 📍 Overview
- 📦 Features
- 📂 Structure
- 💻 Installation
- 🏗️ Usage
- 🌐 Hosting
- 📄 License
- 👏 Authors
This repository provides a Minimum Viable Product (MVP) called "OpenAI-API-Python-Client". It offers a user-friendly Python backend API wrapper that simplifies the integration of OpenAI's powerful NLP capabilities into various projects. This MVP differentiates itself by focusing on simplicity and efficiency, making it ideal for developers of all skill levels.
| | Feature | Description |
|---|---|---|
| ⚙️ | Architecture | Utilizes a microservices architecture, with the API wrapper running as a standalone service. This provides flexibility and allows for independent scaling of components. |
| 📄 | Documentation | Provides detailed documentation, including API usage instructions, code examples, and tutorials. This streamlines onboarding and enables users to quickly learn and leverage the API. |
| 🔗 | Dependencies | Leverages various libraries such as `fastapi`, `uvicorn`, `pydantic`, `openai`, `sqlalchemy`, `psycopg2-binary`, `alembic`, `pyjwt`, `requests`, `logging`, and `prometheus_client` for essential functionalities. |
| 🧩 | Modularity | The codebase is organized into modules for user management, API interaction, data validation, database interactions, and utility functions, promoting code reusability and maintainability. |
| 🧪 | Testing | Includes a comprehensive testing framework, including unit tests, integration tests, and end-to-end tests. This ensures the quality, stability, and reliability of the codebase. |
| ⚡️ | Performance | Employs optimization techniques such as caching API responses, optimizing database queries, and asynchronous processing to ensure efficient operation. |
| 🔐 | Security | Implements security measures to protect user data and API keys, including secure storage of API keys, data encryption, and rate limiting. |
| 🔀 | Version Control | Uses Git for version control and employs a Gitflow branching model for a structured and collaborative development process. |
| 🔌 | Integrations | Integrates with popular cloud platforms like AWS or Azure for hosting the database and server infrastructure. |
| 📶 | Scalability | Designed for scalability, leveraging cloud-based solutions for automatic scaling and resource management. |
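The response caching called out in the table above can be sketched with a simple in-memory memoizer. This is an illustrative assumption, not the MVP's actual implementation; `call_openai` below is a hypothetical placeholder for the real API call:

```python
import functools

def call_openai(text: str, model: str) -> str:
    """Hypothetical placeholder for the real OpenAI API call."""
    return f"completion for {text!r} using {model}"

@functools.lru_cache(maxsize=256)
def complete_cached(text: str, model: str = "text-davinci-003") -> str:
    """Repeated calls with identical arguments return the cached
    response instead of hitting the OpenAI API again."""
    return call_openai(text, model)
```

A production service would more likely cache in an external store such as Redis with a TTL, since `lru_cache` is per-process and never expires entries.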
```text
openai-api-client/
├── api/
│   ├── routes/
│   │   ├── user.py
│   │   └── openai.py
│   └── schemas/
│       ├── user.py
│       └── openai.py
├── dependencies/
│   ├── auth.py
│   ├── database.py
│   ├── openai.py
│   └── utils.py
├── models/
│   ├── base.py
│   ├── user.py
│   └── api_usage.py
├── services/
│   ├── user.py
│   └── openai.py
├── startup.sh
├── commands.json
├── tests/
│   ├── conftest.py
│   ├── unit/
│   │   ├── test_openai.py
│   │   └── test_user.py
│   └── integration/
│       ├── test_openai_routes.py
│       └── test_user_routes.py
├── migrations/
│   └── versions/
│       ├── ...
│       └── alembic_version.py
├── README.md
├── .env.example
├── .env
├── gunicorn.conf.py
├── Procfile
├── .gitignore
└── .flake8
```

- Python 3.9 or higher
- PostgreSQL 14+
- `pip` (Python package manager)
- `alembic` (database migration tool)
Clone the repository:
```shell
git clone https://github.com/coslynx/OpenAI-API-Python-Client.git
cd OpenAI-API-Python-Client
```

Install dependencies:
```shell
pip install -r requirements.txt
```
Set up the database:
- Create a PostgreSQL database and user if you don't already have one.
- Update the `DATABASE_URL` in your `.env` file with your database connection string.
- Run database migrations:
```shell
alembic upgrade head
```
Configure environment variables:
- Create a `.env` file based on the `.env.example` file.
- Replace placeholder values with your actual API keys and database credentials.
Start the application server:
```shell
uvicorn main:app --host 0.0.0.0 --port 8000
```
- `.env` file: contains environment variables such as `OPENAI_API_KEY`, `DATABASE_URL`, and `SECRET_KEY`.
- `gunicorn.conf.py`: configures the `gunicorn` web server for deployment.
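For reference, a `.env` file covering those variables might look like the following (all values are placeholders; the database name and credentials are assumptions to adapt to your setup):

```
OPENAI_API_KEY=sk-your_openai_api_key
DATABASE_URL=postgresql://user:password@localhost:5432/openai_api_client
SECRET_KEY=your_secret_key
```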
User Registration:
```shell
curl -X POST http://localhost:8000/api/v1/users/register \
  -H "Content-Type: application/json" \
  -d '{"username": "your_username", "email": "your_email@example.com", "password": "your_password"}'
```
User Login:
```shell
curl -X POST http://localhost:8000/api/v1/users/login \
  -H "Content-Type: application/json" \
  -d '{"email": "your_email@example.com", "password": "your_password"}'
```
Text Completion:
```shell
curl -X POST http://localhost:8000/api/v1/openai/complete \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your_access_token" \
  -d '{"text": "The quick brown fox jumps over the", "model": "text-davinci-003", "temperature": 0.7, "max_tokens": 256}'
```
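The same completion request can be issued from Python using only the standard library. The helper below just assembles the authenticated request; the URL, token, and field names mirror the curl example above:

```python
import json
import urllib.request

def build_completion_request(base_url: str, token: str, text: str,
                             model: str = "text-davinci-003",
                             temperature: float = 0.7,
                             max_tokens: int = 256) -> urllib.request.Request:
    """Assemble an authenticated POST to the /api/v1/openai/complete route."""
    payload = json.dumps({
        "text": text,
        "model": model,
        "temperature": temperature,
        "max_tokens": max_tokens,
    }).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/api/v1/openai/complete",
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )

req = build_completion_request("http://localhost:8000", "your_access_token",
                               "The quick brown fox jumps over the")
# With the server running, send it via: urllib.request.urlopen(req)
```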
Deploying to Heroku:
Install the Heroku CLI (it is not distributed via pip; see https://devcenter.heroku.com/articles/heroku-cli for platform-specific installers), for example with the standalone install script:

```shell
curl https://cli-assets.heroku.com/install.sh | sh
```
Log in to Heroku:
```shell
heroku login
```
Create a new Heroku app:
```shell
heroku create openai-api-python-client-production
```
Set up environment variables:
```shell
heroku config:set OPENAI_API_KEY=your_openai_api_key
heroku config:set DATABASE_URL=your_database_url
heroku config:set SECRET_KEY=your_secret_key
```
Deploy the code:
```shell
git push heroku main
```
Run database migrations:
```shell
heroku run alembic upgrade head
```
- `OPENAI_API_KEY`: your OpenAI API key.
- `DATABASE_URL`: your PostgreSQL database connection string.
- `SECRET_KEY`: a secret key for JWT authentication.
POST `/api/v1/users/register`: Register a new user.

Request Body:

```json
{"username": "your_username", "email": "your_email@example.com", "password": "your_password"}
```

Response Body:

```json
{"id": 1, "username": "your_username", "email": "your_email@example.com", "api_key": "your_api_key"}
```
POST `/api/v1/users/login`: Log in an existing user and obtain an access token.

Request Body:

```json
{"email": "your_email@example.com", "password": "your_password"}
```

Response Body:

```json
{"access_token": "your_access_token", "token_type": "bearer"}
```
GET `/api/v1/users/me`: Get the current user's information.

Authorization: `Bearer your_access_token`

Response Body:

```json
{"id": 1, "username": "your_username", "email": "your_email@example.com", "api_key": "your_api_key"}
```
POST `/api/v1/openai/complete`: Complete a given text using OpenAI's text completion API.

Authorization: `Bearer your_access_token`

Request Body:

```json
{"text": "The quick brown fox jumps over the", "model": "text-davinci-003", "temperature": 0.7, "max_tokens": 256}
```

Response Body:

```json
{"response": "lazy dog."}
```
POST `/api/v1/openai/translate`: Translate a given text using OpenAI's translation API.

Authorization: `Bearer your_access_token`

Request Body:

```json
{"text": "Hello world", "source_language": "en", "target_language": "fr"}
```

Response Body:

```json
{"response": "Bonjour le monde"}
```
POST `/api/v1/openai/summarize`: Summarize a given text using OpenAI's summarization API.

Authorization: `Bearer your_access_token`

Request Body:

```json
{"text": "The quick brown fox jumps over the lazy dog.", "model": "text-davinci-003"}
```

Response Body:

```json
{"response": "A brown fox jumps over a lazy dog."}
```
- Register a new user or login to receive a JWT access token.
- Include the access token in the `Authorization` header for all protected routes, using the format: `Authorization: Bearer your_access_token`
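Since the access token is a JWT, its payload can be inspected (but not verified!) with the standard library alone. This is a generic illustration of JWT structure, not part of the MVP's code:

```python
import base64
import json

def peek_jwt_payload(token: str) -> dict:
    """Decode the (unverified) payload segment of a JWT for debugging."""
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))
```

Never trust these claims without signature verification; server-side, `pyjwt` (already listed as a dependency) performs the verified decode against `SECRET_KEY`.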
This Minimum Viable Product (MVP) is licensed under the GNU AGPLv3 license.

This MVP was entirely generated using artificial intelligence through CosLynx.com.
No human was directly involved in the coding process of the repository: OpenAI-API-Python-Client
For any questions or concerns regarding this AI-generated MVP, please contact CosLynx at:
- Website: CosLynx.com
- Twitter: @CosLynxAI
Create Your Custom MVP in Minutes With CosLynxAI!