Listen API Announcements

Stay informed with the API changelog. If you are an API user, you'll automatically receive email notifications about new announcements.

Some API improvements. No action is required on your end.

May 20, 2019

[You can read the detailed version of this announcement on our blog]

Better documentation

Recently, we created an OpenAPI spec for our API documentation. You can take a quick look at Listen API's OpenAPI spec in YAML or JSON. We generate the main documentation page from the standardized OpenAPI spec. Specifically, you can see the API response schema (e.g., data type, example value, default value, description…) on the page.

Furthermore, we embed Runkit widgets on the main documentation page to make it easy for you to test API endpoints and preview the actual response.

Better performance

We made some infrastructure improvements (e.g., code optimization, server upgrade…) and reduced the average API response time from ~100 ms to ~85 ms. You can always go to our status page to monitor the real-time health of Listen API.

Bigger thumbnail images

We made the thumbnail images bigger (the "thumbnail" field in the API response)! Previously the thumbnail image was 150x150 pixels, which looked blurry on modern smartphone screens :) Now the image is 300x300.

Fetch up to 10 latest episodes for multiple podcasts with one API request

May 9, 2019

We just added a new parameter show_latest_episodes to POST /api/v2/podcasts (Batch fetch basic meta data for podcasts). If show_latest_episodes is 1, then the response will include a field "latest_episodes", which is a list of up to 10 latest episodes of the podcasts in this batch, sorted by pub_date_ms in descending order.

You can use this to quickly check whether there are new episodes for a list of subscribed podcasts.
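As a rough sketch, the "any new episodes?" check might look like the following. The field names ("latest_episodes", "pub_date_ms") come from the announcement above; the helper name and sample data are hypothetical.

```python
# Sketch: find new episodes from a batch response that was requested with
# show_latest_episodes=1. Assumes the documented response shape; the helper
# and sample data below are illustrative, not part of the API.

def new_episodes_since(response, last_seen_ms):
    """Return episodes published after last_seen_ms, newest first."""
    episodes = response.get("latest_episodes", [])
    # latest_episodes is already sorted by pub_date_ms in descending order,
    # so the filtered list keeps that order.
    return [e for e in episodes if e["pub_date_ms"] > last_seen_ms]

sample = {
    "latest_episodes": [
        {"id": "ep3", "pub_date_ms": 1557500000000},
        {"id": "ep2", "pub_date_ms": 1557400000000},
        {"id": "ep1", "pub_date_ms": 1557300000000},
    ]
}
print([e["id"] for e in new_episodes_since(sample, 1557350000000)])
# prints ['ep3', 'ep2']
```

In a subscription app, last_seen_ms would be the pub_date_ms you stored the last time you polled.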

Fields updates: audio_length, genres, and latest_pub_date_ms

May 2, 2019

Due to technical debt, we ended up with three inconsistent fields across multiple API endpoints:

1. audio_length was a numeric value (in seconds, e.g., 1234) for one endpoint, but a human-readable string (e.g., 00:02:22) for another. We've introduced a new field "audio_length_sec" to replace "audio_length". audio_length_sec is always a numeric value (in seconds, e.g., 1234).

2. genres was an array of genre ids for one endpoint, but an array of genre names for another. We've introduced a new field "genre_ids" to replace "genres". genre_ids is always an array of genre ids.

3. Previously we had a field "lastest_pub_date_ms" with a typo ("lastest"). We've introduced a new field "latest_pub_date_ms" to replace "lastest_pub_date_ms".

What happens to the old fields (i.e., audio_length, genres, and lastest_pub_date_ms)? If you signed up before May 2, 2019, you can continue using those old fields. Your code won't break. For new users who signed up after May 2, 2019, those old fields are gone from the API response (and from the documentation) - we don't want to confuse new users :)

We encourage you to use the new fields audio_length_sec, genre_ids, and latest_pub_date_ms. We'll completely remove the old fields (i.e., audio_length, genres, and lastest_pub_date_ms) in future major versions (v3, v4, v5...).
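One way to migrate is to read the new field names and fall back to the legacy ones when they are still present. This is a sketch, not an official client; only the field names come from the announcement, and note that a legacy audio_length value may be a string on some endpoints.

```python
# Sketch: prefer the renamed fields, falling back to the legacy names for
# accounts that still receive them. Mapping per the May 2, 2019 announcement;
# the helper itself is hypothetical.

RENAMED = {
    "audio_length_sec": "audio_length",           # numeric seconds (legacy value may be a string!)
    "genre_ids": "genres",                        # array of genre ids (legacy may be names!)
    "latest_pub_date_ms": "lastest_pub_date_ms",  # legacy name contained a typo
}

def get_field(item, new_name):
    """Return the value under the new field name, else the legacy one."""
    if new_name in item:
        return item[new_name]
    return item.get(RENAMED[new_name])

episode = {"audio_length": 1234}                  # old-style response
podcast = {"latest_pub_date_ms": 1556000000000}   # new-style response

print(get_field(episode, "audio_length_sec"))     # 1234
print(get_field(podcast, "latest_pub_date_ms"))   # 1556000000000
```

Once your account only receives the new fields, the fallback branch simply never fires and can be deleted.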

Batch fetching podcasts by iTunes ids

April 29, 2019

POST /api/v2/podcasts allows you to fetch up to 10 podcasts with a single request. In addition to using Listen Notes ids and rss urls, you can now provide iTunes ids as parameters.
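A request body for this endpoint might be assembled like this. The parameter names (ids, rss, itunes_ids) follow this changelog; joining multiple values with commas is an assumption about the batch format, so check the API docs before relying on it.

```python
# Sketch: build the form payload for POST /api/v2/podcasts, mixing Listen
# Notes ids, rss urls, and the newly supported iTunes ids. Comma-joining of
# multiple values is assumed, not confirmed by the announcement.

def build_batch_payload(listen_ids=(), rss_urls=(), itunes_ids=()):
    payload = {}
    if listen_ids:
        payload["ids"] = ",".join(listen_ids)
    if rss_urls:
        payload["rss"] = ",".join(rss_urls)
    if itunes_ids:
        payload["itunes_ids"] = ",".join(str(i) for i in itunes_ids)
    return payload

# e.g., an agency with a spreadsheet of iTunes ids (values are made up):
print(build_batch_payload(itunes_ids=[1437084124, 1119389968]))
```

Send the resulting dict as the POST body with your HTTP client of choice, along with your API key header.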

Why would people want to look up podcasts using iTunes ids? Well, some PR/ads agencies already have a list of iTunes ids, and they want an easy way to get podcast meta data from those iTunes ids.

New endpoint to submit a podcast to Listen Notes

April 9, 2019

This new endpoint allows you to submit an RSS url to Listen Notes: POST /api/v2/podcasts/submit. With this endpoint, your users can directly submit new podcasts to Listen Notes from your apps or services (especially podcast hosting services).

If the RSS url exists in our database, the endpoint will return basic meta data of that podcast immediately. Otherwise, we'll review and add the podcast to our database within 12 hours.