
A Good Travel Experience Begins with One Single Booking Portal – How Open APIs Are Leading the Way

For more information on the OpenAPI Initiative and #sig-travel, join the conversation on Slack. https://open-api.slack.com/archives/C0122NPKUR2

Today’s traveler is increasingly shopping for an experience, not just a seat on a plane or a bed in a hotel. In fact, the travel experience begins with the booking environment itself, where consumers expect a single portal that connects travel providers and travel retailers. People are looking for the convenience and value of the delivery and ride-sharing apps they use in their day-to-day lives, and they want that applied to the travel experience.

The reason the travel industry has lagged behind modern expectations is that it has lagged behind the latest technological trends. It’s that simple.

Creating an End-to-End Journey is Hard

The processes and protocols largely used between providers today are based on travel standards agreed to in the 1960s and modified over time. Initially created to address airline interlining, which allows a single ticket to include flights on multiple air carriers, they have been pressed into service across the travel verticals.

As time has gone on, the once-workable solution has begun to show its limitations. There are multiple efforts in the industry to break with the past and pursue an approach where a travel offer can be handled much like any other retail offering in the digital space.

Travel, however, has some unique needs: products are transient (an empty seat is worthless after departure time) and in most cases specific to a location (airplanes must land at an airport). In addition, a travel product must usually be combined with another travel product to satisfy a trip request.

Creating an end-to-end journey implies a combined offer built from multiple offers proposed and/or serviced by multiple providers. This adds a level of complexity that other retail categories avoid: maintaining relationships between offers.

Key Takeaway: Focusing on the experience as a new approach to travel retail requires a new level of interoperability among participants in the travel market.

Addressing Interoperability Will Open Up Broad Opportunities

Solving the issues needed to support experience-based, total-trip retail at scale could unlock massive economic opportunities for many of the distribution channels operating today, or create new ones. Mainstream distribution channels focus mainly on air, which in the US had total operating revenue of $120 billion[1]. However, total US travel revenue for 2019 was $1.1 trillion[2].

Much of that figure represents consumers figuring out for themselves how to make arrangements, a huge missed opportunity to leverage automation. To build the experiences they are looking for, consumers bounce between websites and calendars to find restaurants, museum passes, and jet ski rentals, outside of travel and lodging portals like Expedia.

The reason we aren’t seeing a wider variety of offerings is not hard to understand: for the mainline distributors, it’s not worth the effort to connect, maintain, and monitor small suppliers through bespoke APIs.

By unlocking interoperability, providers of travel products and services would gain access to channels they are currently shut out of due to cost and complexity. The public is eager to get out of the house and experience the world again but would like to avoid the odious task of DIY travel orchestration and management. They want experience-led retailing, which industries like hospitality are investing in, but only at the property level, not at the full-trip level.

The travel industry cannot afford to allow API chaos to continue to be a barrier to more effective retailing.

Getting Alignment on the Solution is Critical

In response, the OpenTravel Alliance (OTA) and the OpenAPI Initiative (OAI) will work together to focus on API conventions and standards, not just messages. Within OAI, there is now a special interest group to focus on travel issues (#sig-travel).

The Travel SIG will be the conduit for the needs of the travel industry that pertain to the OpenAPI Specification (OAS). The OAS is a broad specification intended to help developers solve real-world business issues with as much flexibility as possible.

What is needed for interoperability, and to reduce API chaos and hence distribution costs in travel, is more consistency in API behaviors. OpenTravel will take the lead on providing open-source tooling and publishing reference architectures with reference implementations that adhere to the OAS. OTA 2.0, with its model-driven approach, will form the basis of this more comprehensive approach supporting all travel verticals. This will be done in cooperation with existing travel standards bodies and trade associations.

The overriding goal will be to lower the cost of connectivity to publish, acquire, distribute, and market digital travel products.

What Can I Do?

Join the conversation and help build a more modern and seamless travel industry! For more information on the OpenAPI Initiative and #sig-travel, find us on Slack: https://open-api.slack.com/archives/C0122NPKUR2

For more information on OTA 2.0, including becoming an Open Travel member, go to www.opentravel.org or contact Jeff ErnstFriedman at jeff.ernstfriedman@opentravel.org.


[1] Source: Phocuswright white paper, "Air Sales and the Travel Agency Distribution Channel," April 2019.

[2] Source: U.S. Travel and Tourism Overview (2019), U.S. Travel Association.

Kubeshop, Accelerator/Incubator for Kubernetes, Joins OpenAPI Initiative

The OpenAPI Initiative, the consortium of forward-looking industry experts focused on evolving and implementing the OpenAPI Specification (OAS), is announcing today that Kubeshop.io has joined as a new member. Kubeshop’s CTO, Ole Lensmar, served as the chairman of the OpenAPI Initiative from its inception in 2016 until 2020. We welcome back a familiar face!

Kubeshop is an open-source accelerator/incubator focused on Kubernetes (“K8s”). Kubeshop identifies needs and offers resources and guidance to projects within the k8s space, specifically focusing on helping developers, testers, and DevOps engineers with tooling that strives to make complex workflows simpler and more productive. Currently, Kubeshop is home to Kusk, Kubtest, and Monokle:

  • Kusk allows users to work with an OpenAPI specification as the source of truth for Kubernetes manifests, simplifying cluster configuration and management across a number of popular Kubernetes Ingress Controllers
  • Kubtest provides a framework for testing applications and APIs running under Kubernetes, allowing teams to decouple test orchestration and execution from CI/CD tooling and to centralize test-result management and analysis
  • Monokle simplifies everyday tasks and workflows related to management and debugging of k8s manifests 

“Joining and supporting the OAI was an obvious step for us; the OAI is central to the continued adoption and evolution of the OpenAPI Specification, and we are excited about the prospect of housing projects targeting OpenAPI specific needs and Kubernetes workflows,” said Ole Lensmar, CTO, Kubeshop.

We look forward to watching Kubeshop grow and evolve, and to seeing it help out k8s developers!

OpenAPI Resources

To learn more about participating in the evolution of the OpenAPI Specification, visit: https://www.openapis.org/participate/how-to-contribute

About the OpenAPI Initiative

The OpenAPI Initiative (OAI) was created by a consortium of forward-looking industry experts who recognize the immense value of standardizing on how APIs are described. As an open governance structure under the Linux Foundation, the OAI is focused on creating, evolving and promoting a vendor neutral description format. The OpenAPI Specification was originally based on the Swagger Specification, donated by SmartBear Software. To get involved with the OpenAPI Initiative, please visit https://www.openapis.org

About Linux Foundation

Founded in 2000, the Linux Foundation is supported by more than 1,000 members and is the world’s leading home for collaboration on open source software, open standards, open data, and open hardware. Linux Foundation projects like Linux, Kubernetes, Node.js and more are considered critical to the development of the world’s most important infrastructure. Its development methodology leverages established best practices and addresses the needs of contributors, users and solution providers to create sustainable models for open collaboration. For more information, please visit us at linuxfoundation.org.

Arnaud Lauret, the API Handyman, Covers His Top 4 Must-Attend Sessions at ASC 2021

We talked with Arnaud Lauret, well known as the API Handyman and author of The Design of Web APIs, to find out more about the upcoming ASC 2021 (Sept 28-29).

Lauret is presenting “Taking advantage of OpenAPI for API Design reviews” on Tuesday, September 28 starting at 11:20am PDT.

Lauret is a Senior API Architect at Natixis, based in Paris, France, currently helping everyone from executives to developers to understand what APIs are, why they matter and “how to do them.” He helps teams design their APIs by reviewing and challenging their designs, providing training and building API design guidelines.

During your live API design review demo session at ASC 2021, what are the top 3 key API design principles you will be covering?

Inside/out, semantics, and consistency.

An API that is just a technical connector giving access to the underlying mess, bluntly exposing the inside to the outside, will be a total nightmare for consumers and providers alike. On the consumer side, people get a complex API that requires a high level of expertise and a lot of effort to do simple things. On the provider side, you may have to fear unexpected consequences such as data corruption or security breaches.

Semantics are very important; the meaning of words is critical when designing an API. Choose the wrong word and people may not understand your intent, what the API is supposed to do, which value it provides, or what they can do with the returned data.

And last but not least of this top 3: consistency. Designing a consistent API is hard, but it’s worth the cost. Being consistent with the rest of the world will make developers feel at home when they see your API for the first time. Being consistent across your APIs and within them will let developers easily make connections between operations, data models, and parameters, so they will be able to master your API without even thinking about it.

Can an API design review be based on facts, not opinions? Is this possible?

Yes. It’s not always that easy, but that’s what I try to do. An API design review based on baseless opinions has absolutely no value. People don’t care if you prefer blue over orange. The question is why blue is better than orange in a given context. Everything that is proposed must be backed with reasonable explanations and facts.

Having guidelines is really a great help: those predefined rules provide factual arguments that everyone agrees on. But guidelines can’t solve all design questions; they provide only high-level guidance, and there will always be more specific functional or business questions leading to multiple possible designs, each one conforming to the guidelines. But which one is the “right” one? You’ll have to find “facts” to make a decision; you can take advantage of your knowledge of business rules, of what will come in the near future, and so on. As long as a solution has a valid explanation everyone agrees on, that’s a review based on facts.

What other presentations will you be attending at ASC 2021 and why?

Actually, I wish I could attend them all! There are so many interesting sessions, but here’s my top 4: 

  • API Terms of Service: From Creative Commons to Machine Readability – Célya Gruson-Daniel, COSTECH & Mehdi Medjaoui: I’m a “machine-readability nerd.” I love this idea of trying to structure the unstructured because it simplifies both machines’ and humans’ jobs. Bringing machine-readability to TOS could have a major impact for me. It could simplify the comparison between service providers; and as someone who participates in call-for-proposal processes to choose software solutions, I would be very happy with just that. And I’m sure there are other totally crazy outcomes. I can’t wait to attend this one!
  • We brought OpenAPI Docs into our service catalog. Now what? – Shai Sachs & Zoe Song, Wayfair: I work in a large organisation with many different teams and many different APIs, and that is not easy. The Wayfair session resonates with my context, and they will also talk about the Backstage open-source service catalog, which I’m interested in.
  • Finding Ways to Measure the Complexity of an API Design – Stephen Mizell, API Consultant: I help people design APIs and I design APIs. It’s not always easy to evaluate whether a design is complex or not. Sometimes it just feels complex or simple, and I don’t like relying on a baseless feeling. I would prefer to back my evaluation with something more concrete, true facts if possible. That’s why I really look forward to what Stephen has to say on this topic.
  • Mistakes to avoid during API Specification Reviews – Rahul Dighe, PayPal: I have done hundreds of API design reviews. I have learned a lot and found solutions to various situations, but sometimes those solutions don’t work, or I face totally new situations. That’s why I’m always interested in learning from what others do: they may have found other solutions to the same problems, or be in a totally different context facing different situations.

Eliminating Technological Barriers with OpenAPI New Member apiplatform.io

The OpenAPI Initiative welcomes apiplatform.io, a productivity management platform that allows users to build and manage backend applications with no code. With apiplatform.io, users can create and host fully functional workflows and APIs to move legacy databases to the cloud. 

apiplatform boasts an extensive client list including Lamico Systems and Site Lantern.

apiplatform allows users to go from idea to product quickly. Thanks to their fully automated but highly configurable model, databases can be efficiently migrated according to the OpenAPI Specification (OAS). By allowing users to modernize databases with speed, apiplatform lessens user costs without sacrificing quality. 

We sat down with co-founder and CEO Muthu Raju to learn more about what apiplatform is currently working on, and why they joined the OpenAPI Initiative. 

Why are you joining the OpenAPI Initiative, and why now?

At apiplatform.io, we automate all the steps in the API development cycle so that our users can achieve integration and digital transformation at speed. We automate API generation and deployment based on API Specification provided as input to our platform.

The OpenAPI Initiative is focused on creating and promoting a vendor-neutral description format. Therefore, we feel OpenAPI Initiative is the right place to collaborate and arrive at a description format for building business logic, workflows, and continuous automation in our platform.

In your mission statement, you talk about eliminating technology barriers. How are you doing this?

We automate the developer lifecycle processes for a given API specification by generating the code for the specification, testing and deploying it to a cloud platform, and setting up a gateway for secure access control. So the end user doesn’t need to understand how to use the tools, platforms, services, and technologies required to operationalize their API, or invest their time in coding, setting up the continuous delivery pipeline, and so on. One can jump from design to delivery of an API for a given specification!

apiplatform.io developed the Covid19 Virtual Assistant to help people find and use COVID-19 data. How does it work, and how does someone access it?

https://covid19.apiplatform.io/ was developed and deployed in early 2020 by integrating the COVID-19 Dashboard by the Center for Systems Science and Engineering (CSSE) at Johns Hopkins University (JHU) along with a Virtual Assistant, “GOVID,” to assist people looking for food, medicine, online consultation, etc. This Virtual Assistant helped people until the Indian government authorized only a few private and government bodies to host such services during the pandemic.

What’s important about the new OpenAPI Specification 3.1.0 for apiplatform.io? What are the challenges for broader adoption?

The OpenAPI Specification, be it 3.1.0 or any other version, will be the primary specification supported by apiplatform.io, which means we automate the API development lifecycle based on the specification. Therefore, we don’t foresee any particular challenge for broader adoption.

OpenAPI Resources

To learn more about participating in the evolution of the OpenAPI Specification, visit: https://www.openapis.org/participate/how-to-contribute

About the OpenAPI Initiative

The OpenAPI Initiative (OAI) was created by a consortium of forward-looking industry experts who recognize the immense value of standardizing on how APIs are described. As an open governance structure under the Linux Foundation, the OAI is focused on creating, evolving and promoting a vendor neutral description format. The OpenAPI Specification was originally based on the Swagger Specification, donated by SmartBear Software. To get involved with the OpenAPI Initiative, please visit https://www.openapis.org

About Linux Foundation

Founded in 2000, the Linux Foundation is supported by more than 1,000 members and is the world’s leading home for collaboration on open source software, open standards, open data, and open hardware. Linux Foundation projects like Linux, Kubernetes, Node.js and more are considered critical to the development of the world’s most important infrastructure. Its development methodology leverages established best practices and addresses the needs of contributors, users and solution providers to create sustainable models for open collaboration. For more information, please visit us at linuxfoundation.org.

The State of API Development and the Upcoming ASC 2021 – A closeup with Mandy Whaley, Azure Dev Tools, Microsoft

ASC 2021 is being held virtually Sept 28-29 this year with an incredible array of API experts, users, and enthusiasts. The OpenAPI Specification (OAS), RAML, Blueprint, gRPC, OData, JSON Schema, GraphQL, and AsyncAPI will all be topics at ASC 2021, enabling attendees to get familiar with these formats and discuss how to use them in practice.

The event has its origins in the API Strategy and Practice Conference (APIStrat) which ran for many years and became part of the OpenAPI Initiative in 2016. The collaborative spirit and community from APIStrat continue with ASC, and we look forward to many lively conversations and debates this year!

To find out more about ASC 2021, we talked with Mandy Whaley, Partner Director of Product, Azure Developer Tools at Microsoft. Whaley is a life-long software developer who has worked in development teams of all sizes and types. The team she leads at Microsoft builds the Microsoft Azure SDK, Azure dev tools for Visual Studio and VS Code, and works with groups across the company on API design and developer experience. Whaley will be a keynote speaker at ASC 2021 and is a great example of the type of skilled, experienced, and – most importantly – accessible people who come and participate in ASC every year. 

What’s the biggest problem with API development in 2021? 

The biggest challenge right now is the tension between the demand for development velocity and the need to design and build consistent, easy-to-use APIs that are long-lasting and stable. It’s a balancing act.

This isn’t a new problem, of course, but APIs exist now in more layers and in more places than ever before. That means there are more teams involved and more dependencies. Outcome-focused, customer-centric API design, powered by tools that help teams understand how developers actually use the APIs, is critical.

Many API development teams are also facing challenges related to scale, throttling, security, and long running operations. These are all areas where the API community has the opportunity to define patterns and practices that will help both API producers and consumers. 

How will API development be different in a year from now? 3 years from now? 

APIs are becoming a central part of how every team builds software. We see this happening as more teams adopt microservices, and as more companies rely on both internal and public APIs for core parts of the business. The types of APIs we build are also changing, and teams need to understand how to expand their API guidelines and design practices beyond REST. Over the next three years, API development is going to mature across all dimensions including security, tooling, testing, design, and observability. I am excited to be a part of the community working to create these new capabilities. 

How important are API skills for getting hired into your Dev Tools teams at Microsoft?

Our team works on a broad scope of developer experience topics in the Developer Division at Microsoft. We build VS Code and Visual Studio extensions as well as the Azure SDK. We also lead our Azure API Guidelines and architecture reviews. We work with teams on REST API design and on language-specific APIs for Python, JavaScript, Java, Go, and .NET. API skills are important for both our product management and engineering team members because every team member needs to be able to think through how an API or SDK detail will affect the developer experience. We look for API skills when hiring for senior level positions, and we mentor team members to help them grow their API skills.

What do you personally hope to get out of presenting at ASC 2021? 

Practitioner-led conferences are my favorite type of event. I am looking forward to connecting with other people who think deeply about APIs and work with all the possibilities and challenges related to designing, building, and maintaining APIs. I have learned so much from the API community and I am excited to give back by helping build a community where we all can learn from each other.

Apart from presenting, is there one presentation in particular at ASC 2021 that you want to attend?

I am eager to attend the full event and am blocking off time so I can attend with the same focused mindset that I would have at an in-person event – except with the bonus that I can be with my dogs and eat my favorite snacks at home.

Here are just a few of the sessions that I am really excited about: 


OpenAPI Resources

To learn more about participating in the evolution of the OpenAPI Specification, visit: https://www.openapis.org/participate/how-to-contribute

About the OpenAPI Initiative

The OpenAPI Initiative (OAI) was created by a consortium of forward-looking industry experts who recognize the immense value of standardizing on how APIs are described. As an open governance structure under the Linux Foundation, the OAI is focused on creating, evolving and promoting a vendor neutral description format. The OpenAPI Specification was originally based on the Swagger Specification, donated by SmartBear Software. To get involved with the OpenAPI Initiative, please visit https://www.openapis.org

About Linux Foundation

Founded in 2000, the Linux Foundation is supported by more than 1,000 members and is the world’s leading home for collaboration on open source software, open standards, open data, and open hardware. Linux Foundation projects like Linux, Kubernetes, Node.js and more are considered critical to the development of the world’s most important infrastructure. Its development methodology leverages established best practices and addresses the needs of contributors, users and solution providers to create sustainable models for open collaboration. For more information, please visit us at linuxfoundation.org.

JSON Schema bundling finally formalised

This article was first published on the JSON Schema Blog and is canonically located at https://json-schema.org/blog/posts/bundling-json-schema-compound-documents

I’ve been known to say “If you haven’t rewritten your OpenAPI bundling implementation recently, then you don’t support OpenAPI 3.1”. This observation may be true, but perhaps some more detail would be helpful? When implementing support for OAS 3.1 and JSON Schema draft 2020-12 in oas-kit, reading the sections of the JSON Schema spec on bundling compound documents, I still wasn’t totally clear on what was expected of compliant tooling. Thankfully, Ben Hutton is here to set the record straight with a worked example. – Mike Ralphson, OAI TSC

Bundling has renewed importance

OpenAPI has long since put the spotlight on JSON Schema, and the release of OpenAPI 3.1 has huge implications for the future of both projects. I’m truly excited.

Developers of platforms and libraries that use OpenAPI haven’t had such a shake up before, and my feeling is it may take more than a few releases to correctly implement all the new shiny features full JSON Schema has to offer.

While the changes from JSON Schema draft-04 to draft 2020-12 are vast and the subject of more blog posts than are likely interesting, one of the key “features” of draft 2020-12 is a defined bundling process. (draft-04 is the version of JSON Schema that OAS used prior to version 3.1.0; or rather, a subset/superset of it.)

Indeed, bundling, if anything, is going to be more important to get right than ever. OAS 3.1 ushering in full JSON Schema support dramatically increases the likelihood that developers with existing JSON Schema documents will use them by reference in new and updated OpenAPI definitions. Ultimate source of truth matters, and it’s often the JSON Schemas.

Many tools don’t support referencing external resources. Bundling is a convenient way to package up schema resources spread across multiple files in a single file for use elsewhere, such as an OpenAPI document.

Existing solutions? New solutions!

There are several libraries which offer bundling solutions; however, they all have caveats, and I haven’t seen any to date which are fully JSON Schema aware. The most popular of these libraries is json-schema-ref-parser; however, it reports that it was not intended to be JSON Schema aware and only covers the JSON Reference specification (which has since been folded back into the JSON Schema specification).

We are hoping to provide you with a canonical implementation (Right, Mike?!) and enough information to get started building your own in your language of choice. (Although, it’s always best to read the full specification when developing implementations.)

Bundling fundamentals

Firstly, let’s visit some key definitions in JSON Schema draft 2020-12. The $id keyword is used to identify a “schema resource”. In the example below, the $id of the resource is https://jsonschema.dev/schemas/mixins/integer.

{
  "$id": "https://jsonschema.dev/schemas/mixins/integer",
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "description": "Must be an integer",
  "type": "integer"
}

A “Compound Schema Document” is a JSON document which has multiple embedded JSON Schema Resources. Below is a simplified example of one we’ll unpack a bit later.

{
  "$id": "https://jsonschema.dev/schemas/examples/non-negative-integer-bundle",
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "description": "Must be a non-negative integer",
  "$comment": "A JSON Schema Compound Document. Aka a bundled schema.",
  "$defs": {
    "https://jsonschema.dev/schemas/mixins/integer": {
      "$schema": "https://json-schema.org/draft/2020-12/schema",
      "$id": "https://jsonschema.dev/schemas/mixins/integer",
      "description": "Must be an integer",
      "type": "integer"
    },
    "https://jsonschema.dev/schemas/mixins/non-negative": {
      "$schema": "https://json-schema.org/draft/2020-12/schema",
      "$id": "https://jsonschema.dev/schemas/mixins/non-negative",
      "description": "Not allowed to be negative",
      "minimum": 0
    },
    "nonNegativeInteger": {
      "allOf": [
        {
          "$ref": "/schemas/mixins/integer"
        },
        {
          "$ref": "/schemas/mixins/non-negative"
        }
      ]
    }
  },
  "$ref": "#/$defs/nonNegativeInteger"
}

Last, let’s look at the carefully crafted definition of “bundling” according to the JSON Schema specification:

“The bundling process for creating a Compound Schema Document is defined as taking references (such as “$ref”) to an external Schema Resource and embedding the referenced Schema Resources within the referring document. Bundling SHOULD be done in such a way that all URIs (used for referencing) in the base document and any referenced/embedded documents do not require altering.”

With these definitions in mind, now we can look at the defined bundling process for JSON Schema resources! We will only cover the ideal situation in this article. The goal here is to have no external Schema Resources.

Note, this article does NOT cover “total dereferencing”, which is removing all uses of $ref from a schema. This is not advised, and is not always even possible, such as when there are self references.

Bundling Simple External Resources

In our first example, we have an ideal situation for bundling. Each schema has an $id and $schema defined, making the bundling process simple. We’ll cover various other situations and edge cases in further examples, but having each resource define its own identity and dialect is always preferable. Our primary schema resource references two other schema resources using the in-place applicator $ref with the value being a relative URI. The relative URI is resolved against the base URI, which in this instance is found in the primary schema resource’s $id value. By combining “integer” and “non-negative” schemas, we create a “non-negative integer” schema.

{
  "$id": "https://jsonschema.dev/schemas/mixins/integer",
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "description": "Must be an integer",
  "type": "integer"
}

{
  "$id": "https://jsonschema.dev/schemas/mixins/non-negative",
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "description": "Not allowed to be negative",
  "minimum": 0
}

{
  "$id": "https://jsonschema.dev/schemas/examples/non-negative-integer",
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "description": "Must be a non-negative integer",
  "$comment": "A JSON Schema that uses multiple external references",
  "$defs": {
    "nonNegativeInteger": {
      "allOf": [
        {
          "$ref": "/schemas/mixins/integer"
        },
        {
          "$ref": "/schemas/mixins/non-negative"
        }
      ]
    }
  },
  "$ref": "#/$defs/nonNegativeInteger"
}

Should the “non-negative-integer” schema be used as the primary schema in an implementation, the other schemas would need to be available to that implementation. At this point, exactly how the implementation loads the schemas doesn’t matter, as they have fully qualified URIs as their identity defined in $id. Any implementation that loads in schemas should build an internal local index mapping the schema URIs defined in $id to their schema resources.

Remember, any schema which provides a value for $id is considered a Schema Resource.

Let’s resolve (dereference) one of the references in our primary schema. "$ref": "/schemas/mixins/integer" resolves to a fully qualified URI of https://jsonschema.dev/schemas/mixins/integer by following the rules for first determining the base URI and then resolving the relative URI against that base URI. The implementation should then check its internal index of schema identifiers and schema resources, find a match, and use the appropriate previously loaded schema resource.
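To make that concrete, here is a minimal, non-normative Python sketch of the indexing and resolution steps described above. The loaded_schemas list is a stand-in for however your implementation acquires schemas; the resolution itself follows RFC 3986, which Python's urllib.parse.urljoin implements.

from urllib.parse import urljoin

# Stand-in for schema resources loaded from disk, the network, etc.
loaded_schemas = [
    {"$id": "https://jsonschema.dev/schemas/mixins/integer", "type": "integer"},
    {"$id": "https://jsonschema.dev/schemas/mixins/non-negative", "minimum": 0},
]

# Internal local index of $id -> schema resource.
index = {schema["$id"]: schema for schema in loaded_schemas}

# Resolve a relative reference against the base URI of the referring resource.
base_uri = "https://jsonschema.dev/schemas/examples/non-negative-integer"
target_uri = urljoin(base_uri, "/schemas/mixins/integer")
# target_uri == "https://jsonschema.dev/schemas/mixins/integer"

resolved = index[target_uri]  # the previously loaded schema resource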

Now for the bundling process itself. The previously externally referenced schemas are copied into $defs in our primary schema, as is. The keys for the $defs object here are the identifying URIs, but they can be anything, as those values won’t be referenced (they could be UUIDs if you like). Looking at our final bundled schema… I mean “Compound Schema Document”, we now have multiple Schema Resources embedded in a single Schema document.

{
  "$id": "https://jsonschema.dev/schemas/examples/non-negative-integer-bundle",
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "description": "Must be a non-negative integer",
  "$comment": "A JSON Schema Compound Document. Aka a bundled schema.",
  "$defs": {
    "https://jsonschema.dev/schemas/mixins/integer": {
      "$schema": "https://json-schema.org/draft/2020-12/schema",
      "$id": "https://jsonschema.dev/schemas/mixins/integer",
      "description": "Must be an integer",
      "type": "integer"
    },
    "https://jsonschema.dev/schemas/mixins/non-negative": {
      "$schema": "https://json-schema.org/draft/2020-12/schema",
      "$id": "https://jsonschema.dev/schemas/mixins/non-negative",
      "description": "Not allowed to be negative",
      "minimum": 0
    },
    "nonNegativeInteger": {
      "allOf": [
        {
          "$ref": "/schemas/mixins/integer"
        },
        {
          "$ref": "/schemas/mixins/non-negative"
        }
      ]
    }
  },
  "$ref": "#/$defs/nonNegativeInteger"
}
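As a rough illustration of that embedding step, here is a minimal, non-normative Python sketch. It assumes the external resources have already been loaded and indexed by $id, and it only handles the ideal case discussed in this article, where every resource declares its own $id and $schema, so nothing needs rewriting.

import copy

def bundle(primary_schema, external_resources):
    """Embed external schema resources into the primary schema's $defs.

    primary_schema: the referring schema (a dict).
    external_resources: a dict mapping $id URIs to schema resources.
    Ideal case only: every resource carries $id and $schema, so resources
    are copied in unchanged and no reference URIs need altering.
    """
    bundled = copy.deepcopy(primary_schema)
    defs = bundled.setdefault("$defs", {})
    for uri, resource in external_resources.items():
        # Keyed by the identifying URI purely for readability; the key
        # plays no part in reference resolution.
        defs[uri] = copy.deepcopy(resource)
    return bundled

Applied to the “non-negative-integer” schema and the two mixin resources from the first example, this would yield a document equivalent to the Compound Schema Document shown above (after adjusting the primary’s $id and $comment for the bundle).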

When the bundled schema is initially loaded and evaluated, the implementation should create its own internal index of schema identifiers and schema resources, just as before. The relative URIs used to reference those schema resources need not change.

The simplest way to see this bundled schema working as expected is to paste it into https://json-schema.hyperjump.io and then try different values for the instance. I hope to bring several updates to https://jsonschema.dev over the next few months, but times are busy as we continue to elevate JSON Schema as an organisation.

It’s worth remembering that the example in this article shows the ideal situation, when best practices have been followed. The JSON Schema specification does define additional processes for non-ideal situations and edge cases (such as when $id or $schema are not set), however, some solutions may be indirectly related to Compound JSON Schema Documents. For example, establishing the base URI follows the steps laid out in RFC3986, which JSON Schema does not redefine.

OpenAPI Specification Example

Let’s look at an example of how this might work with an OpenAPI definition.

openapi: 3.1.0
info:
  title: API
  version: 1.0.0
components:
  schemas:
    non-negative-integer:
      $ref: 'https://jsonschema.dev/schemas/examples/non-negative-integer'

We start with our input OpenAPI 3.1.0 specification document. For brevity, we’re only showing the components section with a single component, but let’s assume some other part of the document uses the component schema “non-negative-integer”.

“non-negative-integer” has a single reference to a JSON Schema resource. The reference URI is an absolute URI, including domain and path, meaning there’s no need to do any “resolve the relative URI against the base URI” dance.

All the schemas required to resolve and bundle the reference are provided to the bundling tooling. After the schemas are loaded into the implementation, their originating physical location no longer matters.

openapi: 3.1.0
info:
  title: API
  version: 1.0.0
components:
  schemas:
    # This name has not changed, or been replaced, as it already existed and is likely to be referenced elsewhere
    non-negative-integer:
      # This Reference URI hasn't changed
      $ref: 'https://jsonschema.dev/schemas/examples/non-negative-integer'
    # The path name already existed. This key doesn't really matter. It could be anything. It's just for human readers. It could be an MD5!
    non-negative-integer-2:
      $schema: 'https://json-schema.org/draft/2020-12/schema'
      $id: 'https://jsonschema.dev/schemas/examples/non-negative-integer'
      description: Must be a non-negative integer
      $comment: A JSON Schema that uses multiple external references
      $defs:
        nonNegativeInteger:
          allOf:
          # These references remain unchanged because they rely on the base URI of this schema resource
          - $ref: /schemas/mixins/integer
          - $ref: /schemas/mixins/non-negative
      $ref: '#/$defs/nonNegativeInteger'
    integer:
      $schema: 'https://json-schema.org/draft/2020-12/schema'
      $id: 'https://jsonschema.dev/schemas/mixins/integer'
      description: Must be an integer
      type: integer
    non-negative:
      $schema: 'https://json-schema.org/draft/2020-12/schema'
      $id: 'https://jsonschema.dev/schemas/mixins/non-negative'
      description: Not allowed to be negative
      minimum: 0

The schemas are inserted into the components/schemas location of the OAS document. The keys used in the schemas object have no importance for reference resolution, although you will want to avoid potential duplications. References need not change, and a processor of the resulting bundled or Compound Document should look for the use of embedded Schema Resources within the OAS document, keeping track of the $id values.
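Sketching what such a processor might do (the file name is hypothetical and PyYAML is assumed), the fragment below loads a bundled OpenAPI document and indexes the embedded Schema Resources by their $id values, ignoring the components/schemas keys for resolution purposes, exactly as described above.

import yaml  # PyYAML

with open("bundled-openapi.yaml") as f:  # hypothetical file name
    oas = yaml.safe_load(f)

# Keep track of every embedded Schema Resource by its $id; the keys under
# components/schemas play no part in reference resolution.
embedded_resources = {
    schema["$id"]: schema
    for schema in oas.get("components", {}).get("schemas", {}).values()
    if isinstance(schema, dict) and "$id" in schema
}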

But what about…

The astute among you might have noticed that Compound Documents may not be correctly validated using a meta-schema for the dialect defined at the document root. One of our principal contributors distilled a great explanation which he has agreed to let us share with you.

“If an embedded schema has a different $schema than the parent schema, then a Compound Schema Document can’t be validated against a meta-schema without deconstructing it into separate schema resources and applying the appropriate meta-schema to each. That doesn’t mean the Compound Schema Document is not usable without deconstruction, it just means that implementations need to be aware that the $schema can change during evaluation and handle such changes appropriately.” – Jason Desrosiers.

If you’d like a more in-depth look at edge case situations, please do let us know.

You can reach out to us at @jsonschema or on our Slack server.

I hope you’ll agree, Ben has clarified the process for us all here, and we can use this example to fully meet JSON Schema’s bundling expectations when writing tools which bundle multiple resources into compound OpenAPI documents. Thanks, Ben! – Mike

APIAddicts Joins OpenAPI Initiative

Welcome to APIAddicts! APIAddicts boasts one of the largest communities of API practitioners, with an extensive international presence in Spanish-speaking countries including Spain, Perú, and México, 5,000+ APIAddicts worldwide, and over a million views.

APIAddicts hosts multiple events throughout the year for developers wanting to discuss, learn more about, and contribute to the growth and integration of APIs. Many details are provided in the interview below. APIAddicts Days is one of the most important events they host annually: companies and professionals share their experiences while analyzing successes and failures in the current framework that supports APIs. To find out what APIAddicts Days was like in 2020, click here.

APIAddicts has a strong education focus. Their School of APIs aims to help professionals improve their use of APIs, and at the end of the courses participants are awarded an API Evangelist certificate/recognition/diploma.

We wanted to learn more about their upcoming events and the latest updates in the API world, and what working with the OpenAPI Initiative would look like. We talked to Marco Antonio Sanz, founder of APIAddicts, to find out more!

How was APIAddicts born?

In 2013, five very techie friends and colleagues were working in large companies managing APIs and realized there was a lack of API documentation and education in Spanish at that time.

So they started giving a series of informal talks to share their knowledge. Meetup groups were created, which are still active today, to connect with other people interested in digital transformation, and we started to meet monthly to talk about news and updates in the sector. In fact, the first meetings were in a bar in Getafe, until things were gradually formalized.

This kept growing and finally formed what is today APIAddicts, the largest Spanish-speaking community around APIs, with more than 5,000 APIAddicts in Spain, Chile, Peru, Mexico, Colombia, and Argentina.

How hard is it to get an API Evangelist Diploma? What are the main benefits?

The evangelization of APIs and the training of professionals in them are so important that, from the Foundation, we wanted to go a step further and help companies qualify their teams with the most up-to-date knowledge from the best experts.

We had already been running the API Owner course for years, which is in great demand today, and we decided to complete the training with specialization courses in Security, Testing, Design, and Architecture. This way you can get an overview of the entire APIfication process of a company and learn how to manage each of these phases optimally.

All the courses together involve 50 hours of training, 80% of it hands-on practice where students can apply the knowledge they have learned. Once the courses are finished, students present an API project to the teachers, who evaluate and assess the documentation.

If their API project meets the requirements and concepts of the courses, they obtain the certification for each specialty and the API Evangelist Diploma, the highest certification for an API professional.

Nowadays many companies trust the Foundation’s API School courses to specialize their teams: PMs, QA, technical leads, analysts, coordinators, and more. Taking these courses guarantees knowledge of the whole APIfication process and a clear command of the necessary tools. This improves not only your job prospects but also your career within your company.

What is APIAddicts’ mission in the evangelization of APIs?

We have always been aware that, although APIs are a key factor for companies that want to be competitive today, the term generates a lot of confusion. Therefore, we could say that the APIAddicts Foundation’s evangelizing mission began in 2013, when APIs were already being used but no one fully understood what they were for. In fact, the role of our community has been essential in introducing the transformative potential of APIs in Spain.

In this sense, we carry out annual studies, open-source tools, training courses, three large events (Webseries, Startups and APIs, and APIAddicts Days), and more than 20 annual meetups. We are the community that has created the most API documentation in Spanish, and we now hope to continue doing so with OpenAPI.

If we had to define our main differentiator, it would be that we always try to bring in the sector’s leading figures to inspire the use of APIs, and always with a close, personal approach, being a real community. That is why, for example, none of our API School courses are recorded: for us, recording makes you lose an essential part of API training, sharing experiences and real cases with the expert live.

Finally, every day, and more and more strongly, we continue to do what we were born for: to bring the knowledge of APIs to every Spanish-speaking corner of the world, in a disruptive way.

What is the relationship between APIAddicts and CloudAPPI?

After the evangelization of APIs began through talks and meetups, the community gained popularity, and with it came business opportunities. This was a turning point at which the founders decided to leave their usual jobs to set up their own consulting firm, separate from the association: on one side runs the foundation and on the other the company, but they accompany each other as partners in their growth, each approaching the API life cycle from a different point of view.

Therefore, today the relationship between the two entities is based on sponsorship, and they actively participate in different events together. However, the Foundation supports itself with its own budget thanks to its other sponsors, the API School courses, and ticket sales for some of our events.

We have always been clear about this: the Foundation was born altruistically from a group of techie friends who love APIs, to promote knowledge of APIs in Spanish, and it will always remain so.

Why did APIAddicts join OpenAPI Initiative and why now?

In these last few years we have been trying to take API evangelism further in all formats: open-source tools, international events, new broadcasting formats such as podcasts, specialization courses, and more. Within all of this, we saw the need to be part of a development community as essential to the future of APIs as the OpenAPI Initiative.

This way we can offer added value to our community and increase the role of the Foundation in the future of APIs. We want to help improve the specification and thus raise the level of the tools that enable API-first methodologies, such as API Tools.


We enjoyed hearing from the APIAddicts team and learning about the fantastic work they are doing. We appreciate the value they place on community building and on voicing the needs of the API community.

OpenAPI Resources

To learn more about participating in the evolution of the OpenAPI Specification, visit: https://www.openapis.org/participate/how-to-contribute

About the OpenAPI Initiative

The OpenAPI Initiative (OAI) was created by a consortium of forward-looking industry experts who recognize the immense value of standardizing on how APIs are described. As an open governance structure under the Linux Foundation, the OAI is focused on creating, evolving and promoting a vendor neutral description format. The OpenAPI Specification was originally based on the Swagger Specification, donated by SmartBear Software. To get involved with the OpenAPI Initiative, please visit https://www.openapis.org

About Linux Foundation

Founded in 2000, the Linux Foundation is supported by more than 1,000 members and is the world’s leading home for collaboration on open source software, open standards, open data, and open hardware. Linux Foundation projects like Linux, Kubernetes, Node.js and more are considered critical to the development of the world’s most important infrastructure. Its development methodology leverages established best practices and addresses the needs of contributors, users and solution providers to create sustainable models for open collaboration. For more information, please visit us at linuxfoundation.org.

OAI Additional Leadership Job Posting

The OpenAPI Initiative (OAI) is looking for three individuals to fill three separate year-long contracts to help lead the conversation around the leading API specification. We are looking for self-starting individuals who are familiar with the OpenAPI Specification (OAS) and will roll up their sleeves to help move a handful of projects forward.

Over the last five years, the OpenAPI Specification (OAS) has emerged as the leading way to describe the surface area of your APIs, and the need for help supporting the community has grown dramatically. As part of this growth we’ve identified a few priority projects that we are managing using GitHub Projects, and we would love to get you involved!

Projects

Here is a list of the current projects the OAI members and community have prioritized. Each is tagged by the type of work involved, with a brief description of what is involved and a link to the Kanban board for the project.

  • OpenAPI Search Engine Optimization (Marketing) – Ongoing work to help improve the search engine optimization for the OAI and OAS, helping elevate the presence of the specification.
  • New Member Identification & Engagement (Marketing, Community, Business Development) – Work to identify potential new members to the OAI and reach out to them to help start a conversation and make them aware of the benefits of being a member.
  • Member Showcase & Engagement (Marketing, Community, Business Development) – Continually improving how we showcase and engage with the members of the OAI, and increase their participation within the OAS community.
  • Profile & Engage with API Providers (Community, Business Development) – Work to identify, profile, and build relationships with API providers who have implemented the OpenAPI Specification, and publish profiles to a central database.
  • Profile and Engage with Open Source Tooling (Marketing, Community, Business Development) – Establish an official directory of open source tooling that uses OAS, and actively work to establish and build relationships with each tooling provider to get them more involved in the community.
  • Curate and Publish API Articles, Podcasts, Videos (Marketing) – Work to discover, curate, and then showcase and syndicate the existing articles, podcasts, and videos that exist about OpenAPI.
  • Business Sector Showcase & Engagement (Business Development) – Work to profile business sectors that are putting OpenAPI to work, then engage, and build relationships with individuals or organizations, while helping stimulate OAI special interest groups in these areas.

All of these positions require leadership qualities to help define, prioritize, and execute on plans, and to engage with the OpenAPI community to increase participation and member value and to drive overall community energy.

Contract Details

These are the details of the work, with more details available upon request:

  • 3 Individual Part-Time Contractors
  • One-Year Contract (September 15, 2021 through September 15, 2022)
  • Open to Global & Remote Contract Workers
  • Fixed Price Contracts ($2,500.00 Monthly / $30,000.00 Annual)
  • Monthly Report Demonstrating the Value of Work

Requirements

These are the basic requirements for being considered for the contract positions:

  • OpenAPI Awareness
  • Project Management Skills
  • Writing and Editing Skills
  • Enjoys Working with People
  • Embraces values of collaboration, inclusion, and diversity

Contract Pitch

When applying for one of the contracts, please pitch us on which project or projects your skills are a match for, and demonstrate why you would be our choice not just to make a project happen, but to help us lead where the project goes next.

How to Apply

Please email your cover letter, resume, and any questions to jobs@openapis.org by August 15th, 2021 to be considered for the positions by the OAI hiring committee, with a start date of September 15th, 2021.

Get Involved

Even if you aren’t interested in applying for any of these job postings, we encourage you to get involved in the community and help out on any of these projects. In addition to participating in the weekly technical and marketing meetings, the OAI is looking for help across these projects, with writing blog posts, and with creating educational materials, so please feel free to reach out via social channels or email to get involved.

An OpenAPI Interview with Marc-André Giroux of GitHub

By Kin Lane, Chief Evangelist at Postman, and Co-Chair of the OpenAPI Initiative (OAI) Business Governance Board (BGB)

I recently sat down with Marc-André Giroux, Staff Developer at GitHub, to learn more about how the social coding platform employs OpenAPI across the API lifecycle. GitHub plays a central role in the development of software of any kind and is also central to the entire API lifecycle, and the platform’s APIs provide a wealth of learnings when it comes to delivering API infrastructure at scale. So we are honored to have sat down for this discussion about the OpenAPI journey GitHub is on, and to share this vision with the OAS community.

When did you begin your journey using the OpenAPI specification?

GitHub began looking into OpenAPI in 2019. Funnily enough, two teams started to look at it independently, which highlights how OpenAPI brings many different benefits to the table. On one end, our documentation team wanted a machine-readable document from which to generate their documentation. On the other hand, the API team (the team I am on) wanted to use OpenAPI as something core to our API development experience. Things like having a design-first flow, request validation, contract testing, linting, etc.

Our documentation team started exploring with OpenAPI first. They ingeniously generated an initial document from our hand-written documentation. This was a great start, but from our API team’s perspective, since it was generated from something that we could not entirely trust, it was important for us that this OpenAPI description was being tested against, and used by our runtime code to make sure it is both accurate and complete.

Over the first half of 2020, we worked with the documentation team to use their initial work, bring it over to the API’s codebase and use it in a validation middleware that would tell us about any mismatches. We built tooling to automatically fix errors, fixed a lot of errors by hand, and in July 2020, we were ready enough to release the beta of our OpenAPI description.

What teams are currently using OpenAPI within your organization?

The surface area of our API  is very wide. Almost every team working on a feature at GitHub will end up writing an API for it. This means that in the end, a ton of different teams interact with OpenAPI. As the API team, we consider these teams to be our customers as well. We make sure their experience with OpenAPI is great and that we have the right tooling to help them build great APIs. Our documentation team also uses OpenAPI to generate most of our API documentation website.

Is OpenAPI used for public APIs? Internal APIs? Or Both?

OpenAPI is currently only used for our public API. We have a lot of internal APIs that we would eventually like to describe with OpenAPI. However, we’ve recently made the switch to using Twirp for most internal use cases, which is described by Protobufs.


Can you share more about how OpenAPI is being used across different areas?

Documentation

The OpenAPI written by our developers gets automatically synced with our documentation repository. Our documentation team built an amazing pipeline which decorates the OpenAPI further and is used to render our API documentation.

Code Generation

We also sync the OpenAPI description to our open-source repository. This is used by our first code generation use case, the Octokit JS SDK. Gregor Martynus did an amazing job using the OpenAPI we publish to generate TypeScript types and power the JavaScript SDK.

Soon, we want to help our own developers with code generation as well. We already leverage OpenAPI operations at runtime to configure certain resources, but we want to push it further and help generate implementations from OpenAPI operations.

Testing

GitHub’s API is implemented in an enormous Ruby monolith, with thousands and thousands of existing tests. We leveraged this by including a custom OpenAPI validation middleware in our existing integration testing suite for the API. This contract testing is integral to our confidence in the accuracy of our OpenAPI documents and was one of the hardest, and most important, parts of our OpenAPI work.

A lot of existing OpenAPI validators return errors that are not necessarily friendly to users. We spent a lot of time coming up with an abstraction that lets us return errors that are either developer-facing or end-user-facing. For example, the OpenAPI validation errors that are returned as part of our integration tests return error messages that instruct the developer on why the error happened and how to fix it. On the request validation side of things, we instead instruct the user on why their input was invalid, and how to correct it.
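This isn’t GitHub’s middleware (their monolith is Ruby and the middleware is custom), but the underlying idea of contract-testing a response against the schema declared in an OpenAPI description can be sketched in a few lines of Python using the jsonschema package; the schema and payload below are purely illustrative.

import jsonschema

# Illustrative response schema, as it might appear in an OpenAPI description.
repo_schema = {
    "type": "object",
    "required": ["id", "full_name"],
    "properties": {
        "id": {"type": "integer"},
        "full_name": {"type": "string"},
    },
}

def assert_response_matches_contract(response_json):
    # Raises jsonschema.exceptions.ValidationError on any mismatch; real
    # tooling would translate that into a developer- or user-facing message.
    jsonschema.validate(instance=response_json, schema=repo_schema)

assert_response_matches_contract({"id": 1296269, "full_name": "octocat/Hello-World"})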

What were the driving factors for you to adopt OpenAPI?

While generating documentation was an amazing benefit, OpenAPI was more than just that to us. Enabling powerful tooling and building a great developer experience were the main reasons we looked at OpenAPI.

Do you make a copy of your OpenAPI available for download by users? If so, where do you make it available?

We do! As mentioned earlier, we have it available as an open-source repository located here.

Do you currently have any OpenAPI extensions you are using or have established as part of your operations?

We use quite a few extensions that are useful at different stages of the “lifecycle” of our operations. We support many different versions of GitHub’s REST API: our classic public API, but also every version of GitHub Enterprise. While the APIs are similar, they do have differences across versions. Instead of duplicating every single operation and component, we opted for a build system that allows us to annotate OpenAPI objects with versioning information. For example, we have an `x-github-releases` extension that allows developers to include or exclude operations from certain releases. This extension is not present in our published descriptions since it is stripped by that build system.
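The exact semantics of `x-github-releases` aren’t published, but a build step of that kind could plausibly look like the following sketch, which keeps only the operations annotated for a target release and strips the extension before publishing. The release names and object shapes are assumptions for illustration.

```ts
// Hypothetical sketch of a build step driven by a release-annotation extension:
// keep only operations tagged for the target release, then strip the extension
// before publishing. Release names and shapes are assumptions, not GitHub's format.
type AnnotatedOperation = {
  summary?: string;
  "x-github-releases"?: string[]; // e.g. ["api.github.com", "ghes-3.9"] (hypothetical values)
  [k: string]: unknown;
};

type Paths = Record<string, Record<string, AnnotatedOperation>>;

function buildForRelease(paths: Paths, release: string): Paths {
  const out: Paths = {};
  for (const [path, ops] of Object.entries(paths)) {
    for (const [method, op] of Object.entries(ops)) {
      const releases = op["x-github-releases"];
      if (releases && !releases.includes(release)) continue; // excluded from this release
      const { "x-github-releases": _omit, ...published } = op; // strip the private extension
      (out[path] ??= {})[method] = published;
    }
  }
  return out;
}
```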

We have other public extensions that add GitHub-specific metadata to some OpenAPI objects. For example, we have an extension called `x-github-previews` which indicates whether an operation is part of an API preview.

Other extensions are described in our open-source repository.

Is your team joining the technical developer community conversations to evolve the OpenAPI specification?

Not at the moment, but the recent Overlay Specification talks are of great interest to us. We will most likely join these conversations very soon.

Do you use other API specifications (ie. AsyncAPI, GraphQL, gRPC)? If so, can you share details of why?

We do! We use GraphQL for another public API, which offers a different way to interact with GitHub’s domain and more flexible use cases for our customers (more about this here). As mentioned earlier, we also use Protobuf for our internal Twirp APIs.

How did you convince leadership in your organization that OpenAPI was an important investment?

Fortunately, leadership was already on board with using a machine-readable document to describe our APIs. We already had extensive usage of JSON Hyper-Schema, but the community’s fast adoption of OpenAPI, and how accurately it would allow us to describe our APIs, made us choose OpenAPI in the end.

In Conclusion

GitHub’s adoption of OpenAPI follows a trend we are seeing with other leading API providers such as Salesforce, Twitter, and Plaid. Multiple teams are realizing the benefits of using OpenAPI, with documentation being the number-one driver, followed by code generation and testing. API providers who are further along in their journey have realized that OAS doesn’t always do 100% of what they need and, like GitHub, are extending the specification and increasingly participating in the weekly technical developer community calls to help it evolve. It is gratifying to see leading providers like GitHub recognize and start to leverage the benefits of the OpenAPI Specification (OAS) for their business, and we look forward to having them more involved in the community.
If you’d like to share your OpenAPI journey story, we’d love to hear from you. While many of the details in the Plaid, GitHub, and other API provider stories are similar, it helps to tell them from multiple perspectives and business sectors. If you have an OpenAPI story you’d like to share, feel free to submit a GitHub issue with your ideas, and we’ll consider adding it to the editorial queue. We want to thank GitHub, and Marc-André Giroux in particular, for letting us interview them and share this great story with the OpenAPI community. Thanks to GitHub for providing us all with another example of why it is a leader in how technology is delivered in the 21st century.

OpenAPI Initiative Welcomes RapidAPI

By Announcement, Blog

RapidAPI is joining the OpenAPI Initiative! RapidAPI helps developers find, manage, and test APIs. They provide an API Marketplace for developers to discover and connect to thousands of public APIs. The company’s services can be tailored to a company’s brand, to build private marketplaces for internal and external APIs. Notable enterprise clients include SAP, Cisco, Tata, and Hyatt. 

To learn more about RapidAPI Marketplace, sign up for a free account here.

To better understand how RapidAPI provides a next-generation infrastructure for APIs and what a collaboration with the OpenAPI Initiative would mean for them, we asked Iddo Gino, CEO and Founder.

Why did RapidAPI join the OpenAPI Initiative, and why now?

APIs are the heart and soul of RapidAPI. With acquisitions like the Paw API client and unique developer tools such as RapidAPI Testing, we care deeply about the developer experience of writing, publishing, and ultimately maintaining APIs. OpenAPI plays a significant role in all of this.

Our goal in joining the OpenAPI Initiative is to give back to the community. As we build the next generation of API developer tooling, it’s more important than ever to be part of the API builders’ community conversations. The OAI is where API builders and API consumers go beyond the original spec to collaborate on exciting things such as JSON Schema and asynchronous APIs. RapidAPI wants to help advance the specification through participation in the community.

Additionally, we’re committed to making our platform interface easily with third-party tools. It’s therefore important for us to continually support the OpenAPI Specification format, allow interconnectivity across tools, and avoid customer lock-in.

The RapidAPI team has a history of participation in development communities. Our Head of DevRel, Ahmad Awais, is an active voting member of the Node.js Community Committee and was a WordPress Foundation Core Contributor before that. Our Head of Marketing, Suzanne Panoplos, has been active in the Open Container Initiative and CNCF. As you can see, joining the OpenAPI Initiative and becoming a member of the Linux Foundation couldn’t be a more natural transition for RapidAPI.

What goes into making an API marketplace? What makes one better than the other?

A robust API marketplace enables developers to find, connect to, and manage APIs from a single place, and supports a variety of API styles, including REST, SOAP, GraphQL, and asynchronous APIs. For each API, you should be able to see performance metrics like average latency, uptime, and popularity, and be able to test the API from the browser.

Additionally, a marketplace should enable you to:

  • Gather information about each API using its endpoints page, which lists endpoints, documentation, and a code snippet to help you integrate the API into your app.
  • Subscribe to an API’s plan to start using it, and manage all your API subscriptions and payments through a single source.
  • Utilize a single key for all your APIs.
  • Manage applications and API keys using a single dashboard. Using this dashboard, you should be able to:
    • Monitor API performance by visualizing how many requests are made to different APIs, tracking the number of requests that return an error, and viewing latency data for each API.
    • Debug faster by inspecting the logs for all requested data.
    • View usage and billing information for a breakdown of API spending, including the monthly recurring and overage charges.
    • Manage subscriptions from one place including quota usage and time remaining until the quota limit resets.

How has COVID affected the use of APIs in the past year?

With the pandemic, digitalization has gone from being a “nice to have” to an imperative for survival. Take Starbucks as an example. Before the pandemic, 7% of Starbucks revenue came from mobile ordering. During the pandemic, 100% of their revenue came from mobile ordering as most stores were closed to in-person ordering.

To react to the changing environment during the pandemic, companies had to adjust their application development processes and accelerate delivery to address shifting market conditions. To do so, they had to rely on APIs in their software development.

This trend holds across almost all industries. Still, there has been an explosion of API usage, from healthcare settings, where services, appointments, and scheduling have moved online, to the homebound consumption of services such as food ordering and retail.

This trend was highlighted in RapidAPI’s 2020 Developer Survey, released earlier this year. The survey indicated that developer reliance on APIs accelerated during the pandemic and will continue to increase in 2021. The data revealed that 61% of developers used more APIs in 2020 than in 2019, and 71% plan to use even more during the upcoming year.

What’s your vision for API marketplaces 1-3 years out?

As API proliferation continues to increase across organizations, enterprises will find marketplaces to be a necessity for finding, connecting to, and managing their APIs. Marketplaces will need to provide:

  • An Open Platform – With organizations using disparate API gateways, marketplaces will offer a platform for integrating with multiple different gateways and clouds. 
  • Developer Experience – Marketplaces will need to remain developer-first, providing the key features needed to drive API use and reuse, such as deep search and tagging, API analytics for providers and consumers, and advanced usage controls for developers.
  • Out of the box experience – Marketplaces will need to have everything developers need to succeed, including advanced features such as deep search, end-user analytics, and developer registration workflows.
  • Many-to-Many Model – Marketplaces will need to support multiple teams of API providers offering APIs to internal consumers as well as partners and customers.
  • Support for All API Types – Marketplaces should support all API types, including SOAP, REST, GraphQL, async APIs like Kafka, and more.
  • Scalability – Marketplaces should scale to support hundreds of APIs with room for future growth.
  • Governance – Marketplaces will unify visibility and governance across all APIs in the organization, regardless of which clouds or API gateways are in use. 

We are excited to welcome RapidAPI to our list of growing members and look forward to working closely together!

OpenAPI Resources

To learn more about how to participate in the evolution of the OpenAPI Specification: https://www.openapis.org/participate/how-to-contribute

About the OpenAPI Initiative

The OpenAPI Initiative (OAI) was created by a consortium of forward-looking industry experts who recognize the immense value of standardizing on how APIs are described. As an open governance structure under the Linux Foundation, the OAI is focused on creating, evolving and promoting a vendor neutral description format. The OpenAPI Specification was originally based on the Swagger Specification, donated by SmartBear Software. To get involved with the OpenAPI Initiative, please visit https://www.openapis.org

About Linux Foundation

Founded in 2000, the Linux Foundation is supported by more than 1,000 members and is the world’s leading home for collaboration on open source software, open standards, open data, and open hardware. Linux Foundation projects like Linux, Kubernetes, Node.js and more are considered critical to the development of the world’s most important infrastructure. Its development methodology leverages established best practices and addresses the needs of contributors, users and solution providers to create sustainable models for open collaboration. For more information, please visit us at linuxfoundation.org.