cloudquery/python-plugin-template
This repo contains everything you need to get started with building a new plugin. To get started, all you need to do is change a few names, define some tables, and write an API client to populate the tables.
- `plugin/tables/items.py`
  - `Items` - A boilerplate table definition
  - `ItemResolver` - A boilerplate table resolver
- `plugin/example/client.py`
  - `ExampleClient` - A boilerplate API client
- `plugin/client/client.py`
  - `Spec` - Defines the CloudQuery config
  - `Client` (uses `ExampleClient`) - Wraps your API client
- `plugin/plugin.py`
  - `ExamplePlugin` - The plugin registration; how CloudQuery knows what tables your plugin exposes
The first thing you need to do is identify the tables you want to create with your plugin. Conventionally, CloudQuery plugins have a direct relationship between tables and API responses.
For example, if you had an API endpoint `https://api.example.com/items/{num}` and for each value of `num` it provided an object

```json
{"num": {{num}}, "date": "2023-10-12", "title": "A simple example"}
```

then you would design the table class as:

```python
class Items(Table):
    def __init__(self) -> None:
        super().__init__(
            name="item",
            title="Item",
            columns=[
                Column("num", pa.uint64(), primary_key=True),
                Column("date", pa.date64()),
                Column("title", pa.string()),
            ],
        )
    ...
```

Create one table for each endpoint that you want to capture.
Next, you'll need to define how the tables are retrieved. It's recommended to implement this as a generator, as per the example in `plugin/example/client.py`.
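To illustrate the generator pattern, here is a minimal sketch of an API client whose list method yields items one page at a time. The class and method names, the pagination scheme, and the injected `fetch_page` callable (used here instead of a real HTTP call) are all illustrative assumptions, not part of the template:

```python
from typing import Any, Callable, Dict, Iterator, List


class ExampleClient:
    """Hypothetical client sketch; a real client would issue HTTP requests."""

    def __init__(
        self,
        base_url: str,
        fetch_page: Callable[[int], List[Dict[str, Any]]],
    ) -> None:
        self.base_url = base_url
        # fetch_page is injected for clarity; a real implementation might call
        # e.g. requests.get(f"{base_url}/items?page={n}") and parse the JSON.
        self._fetch_page = fetch_page

    def list_items(self) -> Iterator[Dict[str, Any]]:
        """Yield one item dict at a time so rows can be streamed, not buffered."""
        page = 0
        while True:
            items = self._fetch_page(page)
            if not items:
                return  # an empty page signals the end of the collection
            yield from items
            page += 1
```

Because the method is a generator, the resolver can stream rows to CloudQuery without holding the whole result set in memory.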
Having written your API client, you will have identified the authentication and/or operational variables needed. Add these to the CloudQuery config spec by editing the `Spec` dataclass using standard Python, adding validation where needed.
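As a rough sketch of what that could look like: a dataclass with the variables your client needs plus a validation method. The field names (`access_token`, `base_url`, `concurrency`) and the `validate` method shown here are illustrative assumptions; match them to whatever the `Spec` in `plugin/client/client.py` actually defines:

```python
from dataclasses import dataclass


@dataclass
class Spec:
    # Hypothetical fields; replace with the variables your API client needs.
    access_token: str
    base_url: str = "https://api.example.com"
    concurrency: int = 100

    def validate(self) -> None:
        # Fail fast on a bad config instead of erroring mid-sync.
        if not self.access_token:
            raise ValueError("access_token must be provided")
        if self.concurrency < 1:
            raise ValueError("concurrency must be a positive integer")
```

Fields with defaults become optional in the user's config; fields without defaults are effectively required.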
Finally, you need to edit the `plugin.py` file to set the plugin name and version, and add the `Table`s to the `get_tables` function.
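The registration step can be sketched as below. This is a simplified stand-in, not the SDK's actual base class: the real `ExamplePlugin` subclasses the plugin SDK's plugin class and its `get_tables` signature may differ, and `Items` here is a minimal placeholder for your real `Table` subclass:

```python
class Items:
    # Placeholder for the real Table subclass defined in plugin/tables/items.py.
    name = "items"


class ExamplePlugin:
    # Hypothetical attributes; set these to your team and plugin name.
    name = "example"
    version = "v0.0.1"  # stamped by the package command at release time

    def get_tables(self):
        # Register every table the plugin exposes by listing an instance here.
        return [Items()]
```

Any table not returned from `get_tables` is invisible to CloudQuery, so each new table definition needs an entry in this list.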
To test your plugin you can run it locally.
To automatically manage your virtual environment and install the dependencies listed in the `pyproject.toml`, you can use `poetry`. Poetry is an improved package and environment manager for Python that uses the standardised `pyproject.toml`; if you don't have it installed, you can pull it with `pip install poetry`.
To install the dependencies into a new virtual environment, run `poetry install`. If you have additional dependencies, you can add them with `poetry add {package_name}`, which will add them to the `pyproject.toml` and install them into the virtual environment.
Then run the plugin with `poetry run main serve`, which launches it manually as a gRPC service.
With that running, you can adjust the `TestConfig.yaml` to match your plugin and run `cloudquery sync`. This should result in the creation of a SQLite database `db.sqlite`, where you can validate that your tables are as expected.
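For orientation, a sync config for a locally running gRPC plugin generally follows the shape below. This is a hedged sketch, not the template's actual `TestConfig.yaml`: the source name, the address (which must match the port your `serve` command prints), and the destination version placeholder are all assumptions to be replaced:

```yaml
kind: source
spec:
  name: example            # your plugin's name
  path: localhost:7777     # assumption: address the gRPC plugin is serving on
  registry: grpc           # connect to a locally running plugin over gRPC
  tables: ["*"]
  destinations: ["sqlite"]
---
kind: destination
spec:
  name: sqlite
  path: cloudquery/sqlite
  registry: cloudquery
  version: vX.Y.Z          # pin a real released version of the sqlite plugin
  spec:
    connection_string: ./db.sqlite
```

The `registry: grpc` setting is what lets `cloudquery sync` talk to the plugin you started with `poetry run main serve` instead of downloading one from the registry.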
- Update the plugin metadata in `plugin/plugin.py` to match your team and plugin name.
- Run `python main.py package -m "Initial release" "v0.0.1" .`, where `-m` specifies the changelog message and `v0.0.1` is the version.
- Run `cloudquery plugin publish -f` to publish the plugin to the CloudQuery registry.
More about publishing plugins here.