Annotated version of this introductory video
Datasette is a tool for exploring and publishing data. It helps people take data of any shape, analyze and explore it, and publish it as an interactive website and accompanying API.
Datasette is aimed at data journalists, museum curators, archivists, local governments, scientists, researchers and anyone else who has data that they wish to share with the world. It is part of a wider ecosystem of 46 tools and 158 plugins dedicated to making working with structured data as productive as possible.
Try a demo and explore 33,000 power plants around the world, then follow the tutorial or take a look at some other examples of Datasette in action.
Then read how to get started with Datasette, subscribe to the monthly-ish newsletter and consider signing up for office hours for an in-person conversation about the project.
New: Datasette Desktop - a macOS desktop application for easily running Datasette on your own computer!
Exploratory data analysis
Import data from CSVs, JSON, database connections and more. Datasette will automatically show you patterns in your data and help you share your findings with your colleagues.
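For example, data like this is often loaded into SQLite first using the companion sqlite-utils library. A minimal sketch, assuming a local plants.csv file with a header row (the file and table names here are just placeholders):
import csv
import sqlite_utils

# create (or open) a SQLite database file
db = sqlite_utils.Database("plants.db")

with open("plants.csv", newline="") as f:
    # insert_all creates the table and infers column types from the rows
    db["plants"].insert_all(csv.DictReader(f))

print(db["plants"].count, "rows imported")
Running datasette plants.db then serves a browsable interface over that table.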
Instant data publishing
datasette publish lets you instantly publish your data to hosting providers like Google Cloud Run, Heroku or Vercel.
Rapid prototyping
Spin up a JSON API for any data in minutes. Use it to prototype and prove your ideas without building a custom backend.
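As a rough sketch, assuming the plants.db example above is being served locally with datasette plants.db on the default port 8001, every table is also available as JSON:
import json
import urllib.request

# _shape=array asks Datasette for a plain JSON list of row objects; _size caps the page size
url = "http://localhost:8001/plants/plants.json?_shape=array&_size=5"
with urllib.request.urlopen(url) as response:
    rows = json.load(response)

for row in rows:
    print(row)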
Latest news
7th October 2024
Python 3.13 was released today. Datasette 1.0a16 is compatible with Python 3.13, but Datasette 0.64.8 was not. The new Datasette 0.65 release fixes compatibility with the new version of Python.
5th August 2024
Datasette 1.0a14 includes some breaking changes to how metadata works for plugins, described in detail in the new upgrade guide. See also the annotated release notes that accompany this release.
18th February 2024
Datasette 1.0a10 is a focused alpha that changes some internal details about how Datasette handles transactions. The datasette.execute_write_fn() internal method now wraps the function in a database transaction unless you pass transaction=False.
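In plugin code that performs writes this looks roughly like the sketch below; the method lives on the database object returned by datasette.get_database(), and the database name and helper function here are illustrative only:
def create_log_table(conn):
    # runs inside an automatic transaction by default as of 1.0a10
    conn.execute("CREATE TABLE IF NOT EXISTS log (message TEXT)")
    conn.execute("INSERT INTO log (message) VALUES (?)", ["hello"])

async def write_example(datasette):
    db = datasette.get_database("content")
    await db.execute_write_fn(create_log_table)
    # opt out of the automatic transaction if the function manages its own
    await db.execute_write_fn(create_log_table, transaction=False)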
16th February 2024
Datasette 1.0a9 adds basic alter table support to the JSON API, tweaks how permissions work and introduces some new plugin debugging utilities.
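For example, the JSON write API's insert endpoint can now be asked to add any missing columns as part of the request. A hedged sketch, assuming a local 1.0 alpha instance, a mydatabase/mytable table and an API token with the relevant insert and alter permissions (check the write API documentation for the exact details):
import json
import os
import urllib.request

payload = {
    "rows": [{"id": 3, "name": "New record", "brand_new_column": "added on the fly"}],
    "alter": True,  # create any columns that do not exist yet
}
request = urllib.request.Request(
    "http://localhost:8001/mydatabase/mytable/-/insert",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": "Bearer {}".format(os.environ["DATASETTE_API_TOKEN"]),
        "Content-Type": "application/json",
    },
    method="POST",
)
with urllib.request.urlopen(request) as response:
    print(json.load(response))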
7th February 2024
Datasette 1.0a8 introduces several new plugin hooks, a JavaScript plugin system and moves plugin configuration from metadata.yaml to datasette.yaml. Read more about the release in the annotated release notes for 1.0a8.
1st December 2023
Datasette Enrichments is a new feature for Datasette that supports enriching data by running custom code against every selected row in a table. Read Datasette Enrichments: a new plugin framework for augmenting your data for more details, plus a video demo of enrichments for geocoding addresses and processing text and images using GPT-4.
30th November 2023
datasette-comments is a new plugin by Alex Garcia which adds collaborative commenting to Datasette. Alex built the plugin for Datasette Cloud, but it's also available as an open source package for people who are hosting their own Datasette instances. See Annotate and explore your data with datasette-comments on the Datasette Cloud blog for more details.
22nd August 2023
Datasette 1.0a4 has a fix for a security vulnerability in the Datasette 1.0 alpha series: the API explorer interface exposed the names of private databases and tables in public instances that were protected by a plugin such as datasette-auth-passwords, though not the actual content of those tables. See the security advisory for more details and workarounds if you can't upgrade immediately. The latest edition of the Datasette Newsletter also talks about this issue.
15th August 2023
datasette-write-ui: a Datasette plugin for editing, inserting, and deleting rows introduces a new plugin adding add/edit/delete functionality to Datasette, developed by Alex Garcia. Alex built this for Datasette Cloud, and this post is the first announcement made on the new Datasette Cloud blog - see also Welcome to Datasette Cloud.
9th August 2023
Datasette 1.0a3 is an alpha release of Datasette that previews the new default JSON API design that’s coming in version 1.0 - the single most significant change planned for that 1.0 release.
1st July 2023
New tutorial: Data analysis with SQLite and Python. This tutorial, originally presented at PyCon 2023, includes a 2h45m video and an extensive handout that should be useful with or without the video. Topics covered include Python's sqlite3 module, sqlite-utils, Datasette, Datasette Lite, advanced SQL patterns and more.
24th March 2023
I built a ChatGPT plugin to answer questions about data hosted in Datasette describes a new experimental Datasette plugin to enable people to query data hosted in a Datasette interface via ChatGPT, asking human language questions that are automatically converted to SQL and used to generate a readable response.
23rd February 2023
Using Datasette in GitHub Codespaces is a new tutorial showing how Datasette can be run in GitHub's free Codespaces browser-based development environments, using the new datasette-codespaces plugin.
28th January 2023
Examples of sites built using Datasette now includes screenshots of Datasette deployments that illustrate a variety of problems that can be addressed using Datasette and its plugins.
13th January 2023
Semantic search answers: Q&A against documentation with GPT3 + OpenAI embeddings shows how Datasette can be used to implement semantic search and build a system for answering questions against an existing corpus of text, using two new plugins: datasette-openai and datasette-faiss, and a new tool: openai-to-sqlite.
Latest releases
1st December 2024
llm 0.19 - A CLI utility and Python library for interacting with Large Language Models, including OpenAI, PaLM and local models installed on your own machine.
- Tokens used by a response are now logged to new input_tokens and output_tokens integer columns and a token_details JSON string column, for the default OpenAI models and models from other plugins that implement this feature. #610
- llm prompt now takes a -u/--usage flag to display token usage at the end of the response.
- llm logs -u/--usage shows token usage information for logged responses.
- llm prompt ... --async responses are now logged to the database. #641
- llm.get_models() and llm.get_async_models() functions, documented here. #640
- response.usage() and, for async responses, await response.usage() methods, returning a Usage(input=2, output=1, details=None) dataclass (see the sketch after this list). #644
- response.on_done(callback) and await response.on_done(callback) methods for specifying a callback to be executed when a response has completed, documented here. #653
- Fix for bug running llm chat on Windows 11. Thanks, Sukhbinder Singh. #495
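Together, the new usage and callback methods look something like this from Python (a sketch assuming an OpenAI API key is configured and the default gpt-4o-mini model is used):
import llm

model = llm.get_model("gpt-4o-mini")
response = model.prompt("Describe a pelican in one sentence")

# register a callback that fires once the response has completed
response.on_done(lambda resp: print("Token usage:", resp.usage()))

# consuming the response triggers the prompt and, afterwards, the callback
print(response.text())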
29th November 2024
datasette 0.65.1 - An open source multi-tool for exploring and publishing data
- Fixed bug with upgraded HTTPX 0.28.0 dependency. #2443
23rd November 2024
sqlite-utils 3.38 - CLI tool and Python library for manipulating SQLite databases
- Plugins can now reuse the implementation of the sqlite-utils memory CLI command with the new return_db=True parameter. (#643)
- table.transform() now recreates indexes after transforming a table. A new sqlite_utils.db.TransformError exception is raised if these indexes cannot be recreated due to conflicting changes to the table such as a column rename. Thanks, Mat Miller. (#633)
- table.search() now accepts an include_rank=True parameter, causing the resulting rows to have a rank column showing the calculated relevance score (see the sketch after this list). Thanks, liunux4odoo. (#628)
- Fixed an error that occurred when creating a strict table with at least one floating point column. These FLOAT columns are now correctly created as REAL columns, but only for strict tables. (#644)
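The new include_rank option looks like this from Python (a sketch using an in-memory database and made-up rows):
import sqlite_utils

db = sqlite_utils.Database(memory=True)
db["articles"].insert_all([
    {"title": "Datasette tutorial", "body": "Exploring data with Datasette"},
    {"title": "SQLite tips", "body": "Full-text search tricks for SQLite"},
])
# configure SQLite full-text search across both columns
db["articles"].enable_fts(["title", "body"])

for row in db["articles"].search("datasette", include_rank=True):
    print(row["title"], row["rank"])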
21st November 2024
llm 0.19a2 - A CLI utility and Python library for interacting with Large Language Models, including OpenAI, PaLM and local models installed on your own machine.
- llm.get_models() and llm.get_async_models() functions, documented here. #640
20th November 2024
llm 0.19a1
- response.usage() and, for async responses, await response.usage() methods, returning a Usage(input=2, output=1, details=None) dataclass. #644
llm 0.19a0
- Tokens used by a response are now logged to new input_tokens and output_tokens integer columns and a token_details JSON string column, for the default OpenAI models and models from other plugins that implement this feature. #610
- llm prompt now takes a -u/--usage flag to display token usage at the end of the response.
- llm logs -u/--usage shows token usage information for logged responses.
- llm prompt ... --async responses are now logged to the database. #641
17th November 2024
llm 0.18
- Initial support for async models. Plugins can now provide an AsyncModel subclass that can be accessed in the Python API using the new llm.get_async_model(model_id) method (see the sketch after this list). See async models in the Python API docs and implementing async models in plugins. #507
- OpenAI models all now include async models, so function calls such as llm.get_async_model("gpt-4o-mini") will return an async model.
- The gpt-4o-audio-preview model can be used to send audio attachments to the GPT-4o audio model. #608
- Attachments can now be sent without requiring a prompt. #611
- llm models --options now includes information on whether a model supports attachments. #612
- llm models --async shows available async models.
- Custom OpenAI-compatible models can now be marked as can_stream: false in the YAML if they do not support streaming. Thanks, Chris Mungall. #600
- Fixed bug where OpenAI usage data was incorrectly serialized to JSON. #614
- Standardized on the audio/wav MIME type for audio attachments rather than audio/wave. #603
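The async API looks roughly like this (a sketch assuming an OpenAI API key is configured; the prompt text is just an example):
import asyncio
import llm

async def main():
    model = llm.get_async_model("gpt-4o-mini")
    response = model.prompt("Describe a pelican in one sentence")
    # methods on an AsyncResponse are awaited rather than called directly
    print(await response.text())

asyncio.run(main())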
14th November 2024
llm 0.18a1
- Fixed bug where conversations did not work for async OpenAI models. #632
- __repr__ methods for Response and AsyncResponse.
datasette-write-ui 0.0.1a11
datasette-dashboards 0.7.0 - Datasette plugin providing data dashboards from metadata
llm 0.18a0 - A CLI utility and Python library for interacting with Large Language Models, including OpenAI, PaLM and local models installed on your own machine.
- Alpha support for async models. #507
- Multiple smaller changes.
8th November 2024
sqlite-utils 3.38a0 - CLI tool and Python library for manipulating SQLite databases
- Plugins can now reuse the sqlite-utils memory command with the new return_db=True parameter. #643
1st November 2024
llm 0.17.1 - A CLI utility and Python library for interacting with Large Language Models, including OpenAI, PaLM and local models installed on your own machine.
- Fixed a bug where llm chat crashes if a follow-up prompt is provided. #601
29th October 2024
llm 0.17
Support for attachments, allowing multi-modal models to accept images, audio, video and other formats. #587
The default OpenAI gpt-4o and gpt-4o-mini models can both now be prompted with JPEG, GIF, PNG and WEBP images.
Attachments in the CLI can be URLs:
llm -m gpt-4o "describe this image" \
-a https://static.simonwillison.net/static/2024/pelicans.jpg
Or file paths:
llm -m gpt-4o-mini "extract text" -a image1.jpg -a image2.jpg
Or binary data, which may need to use --attachment-type to specify the MIME type:
cat image | llm -m gpt-4o-mini "extract text" --attachment-type - image/jpeg
Attachments are also available in the Python API:
model = llm.get_model("gpt-4o-mini")
response = model.prompt(
"Describe these images",
attachments=[
llm.Attachment(path="pelican.jpg"),
llm.Attachment(url="https://static.simonwillison.net/static/2024/pelicans.jpg"),
]
)
Plugins that provide alternative models can support attachments, see Attachments for multi-modal models for details.
The latest llm-claude-3 plugin now supports attachments for Anthropic's Claude 3 and 3.5 models. The llm-gemini plugin supports attachments for Google's Gemini 1.5 models.
Also in this release: OpenAI models now record their "usage" data in the database even when the response was streamed. These records can be viewed using llm logs --json. #591
28th October 2024
llm 0.17a0
Alpha support for attachments, allowing multi-modal models to accept images, audio, video and other formats. #578
Attachments in the CLI can be URLs:
llm "describe this image" \
-a https://static.simonwillison.net/static/2024/pelicans.jpg
Or file paths:
llm "extract text" -a image1.jpg -a image2.jpg
Or binary data, which may need to use --attachment-type to specify the MIME type:
cat image | llm "extract text" --attachment-type - image/jpeg
Attachments are also available in the Python API:
model = llm.get_model("gpt-4o-mini")
response = model.prompt(
"Describe these images",
attachments=[
llm.Attachment(path="pelican.jpg"),
llm.Attachment(url="https://static.simonwillison.net/static/2024/pelicans.jpg"),
]
)
Plugins that provide alternative models can support attachments, see Attachments for multi-modal models for details.