Data Model
Factual Data
Factual or "core" metadata includes identifiers and names of the Artists, Albums, Recordings, and Songs. The Album object also includes the number of tracks and an ordinal.
Artist
Artist enumerates the unique Artists contained in the music data.
| Field | Data Type | Description | Example Values |
|---|---|---|---|
| Alternate Name | string | Alternate names include all artist name changes and aliases for the Artist. | An alternate name for Snoop Dogg is Snoop Doggy Dogg. |
| Artist Name | string | The name of the artist. | David Bowie |
| Key Artist | string | A Key Artist may have released material under more than one ensemble name, yet all of these are considered part of the Key Artist's body of work. | Group "The Miles Davis Quartet" has Key Artist: Miles Davis. |
| Similar Artists | string | An editorially assigned field that associates artists that are significantly stylistically similar to one another. | Taylor Swift has similar Artists Naomi Scott, Cher Lloyd, Union J, Nick Jonas, Pia Mia, Tori Kelly, etc. |
| Artist Type | string | Describes whether an Artist is an individual, a group, or a collaboration. | Person: Snoop Dogg; Group: The Beatles; Collaboration: Rihanna Feat. Drake |
Descriptor
Descriptor associates descriptors with Artists, and provides a weight for each association. Descriptors included for an Artist are Genre, Language, Origin, Era, and Artist Type.
| Field | Data Type | Description | Example Values |
|---|---|---|---|
| Descriptor ID | string | The unique identifier for the Genre, Language, Origin, Era, and Artist Type descriptor values assigned to the Artist. | 2932 |
| Weight | integer | The weighted value of the descriptor. | 75 |
Images
Images associates images with an Artist, and stores information about individual images.
| Field | Data Type | Description | Example Values |
|---|---|---|---|
| Size | string | Size of the Artist image. | "XLARGE" |
| Height | integer | Height of the Artist image. | 1080 |
| Width | integer | Width of the Artist image. | 1920 |
| URL | string | URL of the Artist image. | "https://example.com/image.jpg" |
External IDs
External IDs provide a mapping between a customer's external artist identifiers and the associated Gracenote artist IDs.
| Field | Data Type | Description | Example Values |
|---|---|---|---|
| Source | string | The customer name. | www-music-party-id |
| Value | string | The external identifier value. | 12345 |
Band Members
Band Members associates Artists of Type "Group" with the individual Members of the group.
| Field | Data Type | Description | Example Values |
|---|---|---|---|
| Member Artist ID | string | The unique identifier for the member Artist. | GN00KQ1T9WV0FE2 |
| Member Name | string | The name of the member Artist. | Artist Group: The Beatles; Band Members: Stuart Sutcliffe, Ringo Starr, Paul McCartney, John Lennon, George Harrison |
Member of Bands
Member of Bands associates an Artist with the Bands of which that Artist is a member.
| Field | Data Type | Description | Example Values |
|---|---|---|---|
| Band Artist ID | string | The unique identifier for the band Artist. | GN00KQ1T9WV0FE2 |
| Band Name | string | The name of the band. | The Beatles |
Collaboration Members
Collaboration Members associates an Artist of Type "Collaboration" with its individual member Artists.
| Field | Data Type | Description | Example Values |
|---|---|---|---|
| Member Artist ID | string | The unique identifier for the member Artist. | GN00KQ1T9WV0FE2 |
| Member Name | string | The name of the collaboration member. | Drake |
Member of Collaborations
Member of Collaborations lists all collaborations for an artist.
| Field | Data Type | Description | Example Values |
|---|---|---|---|
| Collaboration ID | string | The unique identifier for the Collaboration. | GN00KQ1T9WV0FE2 |
| Collaboration Name | string | The name of the collaboration. | Rihanna Feat. Drake |
Has Key Artist
Has Key Artist associates a Key Artist with an Artist.
| Field | Data Type | Description | Example Values |
|---|---|---|---|
| Key Artist ID | string | The unique identifier for the Artist. | GN00KQ1T9WV0FE2 |
| Key Artist Name | string | The name of the key artist. | Miles Davis |
Album Master
Album Master enumerates all the albums identified as 'Master' that are contained in the Music Data.
| Field | Data Type | Description | Example Values |
|---|---|---|---|
| Album Master ID | string | The Gracenote unique identifier for the master album. | GNEH7NS4W15VQN9 |
| Album Title | string | The title of the album. | Abbey Road |
| Release Year | integer | The year the album was released. | 2008 |
| Track Count | integer | Number of tracks on the album. | 12 |
Descriptor
Descriptor associates a descriptor with an Album, and provides a weight for the association. The descriptor included at the Album level is Genre.
| Field | Data Type | Description | Example Values |
|---|---|---|---|
| Genre ID | string | The unique identifier for the album genre. | 2851 |
| Weight | integer | The weighted value of the genre. | 85 |
Release Type
Release Type provides information about the type of release.
| Field | Data Type | Description | Example Values |
|---|---|---|---|
| Release Type ID | string | The Gracenote identifier for the type of release. | 1, 2, 3, 4, or 10 |
| Release Type Name | string | The name of the release type. | Original, Compilation, Single |
Album Editions
Album Editions enumerates all album releases by Artists.
| Field | Data Type | Description | Example Values |
|---|---|---|---|
| Album Edition ID | string | The Gracenote unique identifier for the album edition. | GN7YJ63ENM30AST |
| Edition Title | string | The title of the album edition. | Abbey Road (Remastered) |
| Release Year | integer | The year the album was released. | 1995 |
Track
Track associates tracks with an Album.
| Field | Data Type | Description | Example Values |
|---|---|---|---|
| Track Number | integer | The sequential track number. | 1, 2, 3, ... |
| Track Title | string | The track title. | Made In Heaven, If You Had My Love |
| Duration | integer | Track duration in milliseconds. | 241684 |
Artist
Artist associates an artist with an Album.
| Field | Data Type | Description | Example Values |
|---|---|---|---|
| Artist ID | string | The unique identifier for the artist. | GN3C3DKB2VEZFNM |
| Artist Name | string | The name of the artist. | The Beatles |
Images
Images associates images with an Album, and stores information about individual cover art.
| Field | Data Type | Description | Example Values |
|---|---|---|---|
| Size | string | Size of the cover art image. | XLARGE |
| Height | integer | Height of the cover art image. | 1080 |
| Width | integer | Width of the cover art image. | 1920 |
| URL | string | URL of the cover art image. | "https://example.com/cover.jpg" |
Recording
Recording enumerates the unique Recordings contained in the data export.
| Field | Data Type | Description | Example Values |
|---|---|---|---|
| Recording ID | string | The Gracenote unique identifier for the recording. | GNB4YN4J7K4WKZ6 |
| Recording Title | string | The title of the recording. | Clémence en vacances |
| Duration | integer | Recording duration in milliseconds. | 241684 |
| ISRC | string | International Standard Recording Code. | DEBL60376500 |
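Because the Duration field is expressed in milliseconds, display code typically converts it before rendering. A minimal sketch (the helper name is our own, not part of the data export):

```python
def format_duration(ms: int) -> str:
    """Convert a duration in milliseconds to an m:ss display string."""
    total_seconds = ms // 1000
    minutes, seconds = divmod(total_seconds, 60)
    return f"{minutes}:{seconds:02d}"

# The example Duration value from the Recording table above:
print(format_duration(241684))  # -> "4:01"
```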
Descriptors
The descriptors object associates descriptors with Recordings, and provides a weight for each association. Descriptors included for a Recording are Genre, Language, Origin, Era, Artist Type, Mood, and Tempo.
| Field | Data Type | Description | Example Values |
|---|---|---|---|
| Descriptor ID | string | The unique identifier for the Genre, Era, Origin, Artist Type, Language, Mood and Tempo descriptor values assigned to the Recording. | 2935, 2342 |
| Weight | integer | The weighted value of the descriptor. | 75 |
External IDs
External IDs provide a mapping between a customer's external track identifiers and the associated Gracenote recording IDs.
| Field | Data Type | Description | Example Values |
|---|---|---|---|
| Source | string | The source of the external ID. | trackIDs |
| Value | string | The external identifier value. | 13312762 |
Song
Song associates different Recordings of the same Composition performed by the same Artist.
| Field | Data Type | Description | Example Values |
|---|---|---|---|
| Song ID | string | The unique identifier of the song. | GN0S4HETSZXRHAW |
| Song Title | string | The title of the song. | Yesterday |
Artist
Artist associates a Song with an Artist.
| Field | Data Type | Description | Example Values |
|---|---|---|---|
| Artist ID | string | The unique identifier for the Artist. | GN3C3DKB2VEZFNM |
| Artist Name | string | The name of the artist. | The Beatles |
Descriptors
A Descriptor is a weighted descriptive attribute of an Artist or Recording. Descriptors assigned directly by Gracenote editorial staff are:
- Language
- Artist Type
- Era
- Genre
- Origin
Other descriptors are Sonic Descriptors, created by machine algorithms trained on expert editorial knowledge. These are:
- Style
- Tempo
- Mood
There can be up to 10 different descriptors of the same type assigned to a single Artist, Album, or Recording instance.
Most instances will have:
- 1 to 10 Genres, Eras, Styles, and Moods each
- 1 to 3 Origins and Tempos each
- 1 Language
- 1 Artist Type
| Descriptor | Number per Instance | Definition |
|---|---|---|
| Artist Type | 1 | Artist Type indicates performance information about the Artist: Artist Grouping (Solo, Duo, Group); Gender Composition (Male, Female, Mixed); Instrumentation (Vocal, Instrumental); Vocal Grouping (Solo, Duo, Group); Vocal Gender Composition (Male, Female, Mixed). |
| Genre | 1-10 | Genre describes the musical style of the Artist, Album, or Recording. Each Artist, Album, and Recording may have one or more Genres assigned, each with an associated weight indicating its relative relevance for that entity. Gracenote's international team of editors selects from a controlled vocabulary of over 2,700 unique Genre Descriptor Values in making these assignments. The combination of Genre Descriptor Values and their weights assigned to an Artist or Recording is referred to as its "Genre Profile". A specific Recording's primary Genre may differ from that of its recording Artist. |
| Style | 1-10 | For each Recording, up to ten detected Styles are included in the data export. Each is assigned a weight indicating how much, out of a total of 100 points, is attributable to that Style for the Recording. The "Primary Style" is the most prominent Style detected across the duration of the Recording. This determination is made according to the relative strengths of each Style detected through supervised machine listening and heuristics. Also see Sonic Style Descriptors. |
| Origin | 1-3 | Origin indicates the geographic region(s) with which the recording Artist is most strongly associated. Note that this is not necessarily their birthplace or country, but rather the music scene with which they are most strongly linked. Most Artists have just a single origin assigned, though some do have more than one. There are over 600 Origin Descriptor Values in the controlled vocabulary used by Gracenote's editors. |
Descriptor Weights
For Language, Genre, Origin, Era, Artist Type, and Tempo, the weights of all descriptors of the same type assigned to a single Recording or Artist entity will add up to approximately 100.
For Sonic Mood, the sum of the Descriptor weights may sometimes add up to less than 100, due to the full Sonic Mood vector extending beyond the 10 most relevant positions published in the Data Export.
Note: For a given entity, the descriptor with highest weight for each Descriptor Type is the "primary descriptor" for that type.
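The weight rules above can be checked directly against a descriptor array. A sketch with invented IDs and weights (not taken from a real export):

```python
# Hypothetical Genre profile for one entity: ID/weight pairs as
# described in the Descriptor Weights section above.
genres = [
    {"ID": "2851", "weight": 60},
    {"ID": "2932", "weight": 30},
    {"ID": "2935", "weight": 10},
]

# For Genre, the weights of all descriptors of the same type sum to ~100.
total = sum(d["weight"] for d in genres)

# The descriptor with the highest weight is the "primary descriptor".
primary = max(genres, key=lambda d: d["weight"])

print(total)          # 100
print(primary["ID"])  # "2851"
```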
Refer to the full JSON schema to see its objects and properties: gracenote_descriptors.json_schema
Correlates
Correlates are provided by Gracenote in a separate file. They enable computation of Artist and Recording similarity by comparing Descriptor Values to each other. A correlate is a numerical value between -1,000 and 1,000. Each correlate quantifies the expected relative similarity (or dissimilarity) of content assigned to any two Descriptor Values of the same Descriptor Type. The correlates file includes a single correlate comparing each Descriptor Value to every other Descriptor Value of the same Type (as well as to itself); that is, there is one correlate for each possible pair.
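One way to apply correlates is a weight-averaged comparison of two entities' descriptor profiles. The correlate values, IDs, and scoring formula below are illustrative assumptions, not the Gracenote similarity algorithm:

```python
# Hypothetical correlate table for two Genre Descriptor Values.
# Real correlates range from -1,000 to 1,000 and come from the
# separate correlates file.
correlates = {
    ("2851", "2851"): 1000,  # a value compared with itself
    ("2851", "2932"): 450,
    ("2932", "2851"): 450,
    ("2932", "2932"): 1000,
}

def profile_similarity(profile_a, profile_b):
    """Weight-averaged correlate across all descriptor pairs."""
    score = 0.0
    total_weight = 0.0
    for id_a, w_a in profile_a:
        for id_b, w_b in profile_b:
            score += w_a * w_b * correlates[(id_a, id_b)]
            total_weight += w_a * w_b
    return score / total_weight

a = [("2851", 70), ("2932", 30)]  # invented Genre profile
b = [("2932", 100)]
print(profile_similarity(a, b))   # 615.0
```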
Descriptors Object
The Descriptors object provides a weighted attribute of an Artist or Recording.
Schema
Code
Members
| Object | Description |
|---|---|
| styles | Array of musical styles. |
| genres | Array of musical genres. |
| languages | Array of languages. |
| tempos | Array of perceived tempos determined by machine algorithms. |
| origins | Array of geographic origins. |
| eras | Array of time periods. |
| artistTypes | Array of artist type classifications. |
| moods | Array of moods determined by machine algorithms. |
Each of the above arrays follows this pattern.
Code
Object Definitions
| Object | Description |
|---|---|
| ID | The unique identifier for the descriptor. |
| weight | The weighted value of the descriptor. |
Example
Code
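Since every descriptor array follows the same ID/weight pattern, one loop can walk the whole Descriptors object. The payload below mimics that shape; its IDs and values are invented for illustration:

```python
import json

# Hypothetical Descriptors object: each array holds {"ID", "weight"}
# entries, per the Object Definitions above.
payload = json.loads("""
{
  "genres": [{"ID": "2851", "weight": 85}],
  "moods":  [{"ID": "42970", "weight": 55}, {"ID": "65322", "weight": 45}],
  "tempos": [{"ID": "34283", "weight": 100}]
}
""")

# One pass covers every descriptor type, since they share the pattern.
for descriptor_type, entries in payload.items():
    for entry in entries:
        print(descriptor_type, entry["ID"], entry["weight"])
```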
Descriptor Hierarchies
Descriptor Hierarchies map Descriptor Values (numbers) to human-readable Display Categories. They are not used to calculate artist or recording similarity, but rather to help categorize and organize descriptors for UI presentation.
Descriptors are mapped to linked parent/child Display Category nodes in each hierarchy. All hierarchies are simple hierarchies. That is, each child node maps to one, and only one, parent node. The hierarchies contain from two to four Display Category levels above the source Descriptor Value level, depending on the individual hierarchy.
There is one Descriptor Display Hierarchy for each of the following Descriptor Types:
- Origin
- Era
- Artist Type
- Tempo
For Genre, there are a total of 16 Hierarchies. Gracenote provides regionally-appropriate genre hierarchies for each of eight international regions. For each of these regions, there is a Simplified and a Detailed version. All Display Category hierarchy nodes are localized and translated into over thirty languages.
All source data at the Descriptor Value level uses the same Controlled Vocabulary. That is, there is only a single set of Descriptor Values used by Gracenote editors and machine listening systems worldwide. All annotations are made using this single controlled vocabulary to ensure semantic integrity of all descriptive data globally.
Refer to the full JSON schema to see its objects and properties: gracenote_hierarchy.json_schema.
Hierarchies
Schema
Code
Members
| Object | Description |
|---|---|
| hierarchyID | The unique identifier for the hierarchy. |
| hierarchyName | The name of the hierarchy. |
| nodes | Array of hierarchy nodes. |
Example
Code
Nodes object
The nodes object contains information for the lowest level display category.
Schema
Code
Members
| Object | Description |
|---|---|
| nodeID | The unique identifier for the node. |
| nodeName | The descriptor name. |
| level | The hierarchy level. |
| parentNodeID | The parent node identifier. |
| localizedNames | Localized Names object. See definition below. |
| mappedDescriptors | Array of mapped descriptor IDs. |
Example
The example below shows the default Era descriptors for the Node "1960s".
Code
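Because each child node maps to exactly one parent, a Descriptor Value can be resolved to its full Display Category path by walking parentNodeID links upward. A sketch using the node fields defined above; the node IDs and names are invented:

```python
# Hypothetical hierarchy nodes, keyed by nodeID.
nodes = {
    "n1": {"nodeName": "Rock", "level": 1, "parentNodeID": None,
           "mappedDescriptors": []},
    "n2": {"nodeName": "Classic Rock", "level": 2, "parentNodeID": "n1",
           "mappedDescriptors": ["2851"]},
}

def display_path(descriptor_id):
    """Return the chain of Display Category names, top level first."""
    # Find the node whose mappedDescriptors include this Descriptor Value.
    start = next(node_id for node_id, node in nodes.items()
                 if descriptor_id in node["mappedDescriptors"])
    path = []
    current = start
    while current is not None:  # walk up until the root node
        path.append(nodes[current]["nodeName"])
        current = nodes[current]["parentNodeID"]
    return list(reversed(path))

print(display_path("2851"))  # ['Rock', 'Classic Rock']
```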
Localized Names object
This object contains the language and script ID for a name.
Schema
Code
Members
| Object | Description |
|---|---|
| languageContext | Language context object. |
| name | The localized name. |
Example
Code
Language Context object
The Language Context object provides information about the language of a Hierarchy Object.
Schema
Code
Members
| Object | Description |
|---|---|
| language_id | The ID number for the language. |
| script_id | The ID number for the script. |
Example
Code
Sonic Style Descriptors
GMD supports descriptors that indicate the sonic styles of a Recording. Sonic Style describes musical styles detected in a Recording using machine learning, based purely on the nature of the recording audio itself. These styles accurately characterize and differentiate individual recordings in an unbiased manner, regardless of expectations about the associated artist, which may not always apply. Sonic Styles are different from Gracenote Genres, which describe the musical and cultural context of an artist and are assigned editorially by Gracenote.
Gracenote Sonic Style offers a fresh take on music search and discovery. Sonic Style creates detailed, consistent and relevant descriptive profiles at the recording level, providing a granular view of musical styles across an artist's entire repertoire, as well as entire music catalogs. By distinguishing Sonic Style at the recording level, streaming music providers can better select the most relevant, engaging and personalized tracks for each listener, spanning multiple eras, based on their unique preferences. Further, by using Sonic Style, record labels and music publishers can gain a deeper understanding of which underlying music styles are driving listening trends around the world.
Navigating and Displaying Sonic Styles
To navigate and display Sonic Styles:
- Extract the Sonic Style IDs and weights listed in the Recording JSON output.
- For each ID, search the Descriptors JSON output. Extract the style names.
- Search mappedDescriptorIDs in the Hierarchy JSON output for Sonic Style IDs from step 1.
- Extract the hierarchy levels and corresponding names.
- Display the style names, hierarchy levels, and weights.
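The steps above can be sketched as follows. The style ID and name come from the "British Invasion" example later in this section; the field names and file shapes are assumptions based on the object definitions in this document:

```python
# Inputs mimicking the Recording, Descriptor, and Hierarchy JSON outputs.
recording = {"styles": [{"ID": "84206", "weight": 100}]}
descriptor_names = {"84206": "British Invasion"}
hierarchy_nodes = [
    {"nodeName": "Rock", "level": 1, "mappedDescriptors": []},
    {"nodeName": "British Invasion", "level": 2, "mappedDescriptors": ["84206"]},
]

results = []
# Steps 1-2: extract style IDs/weights from the Recording, look up names.
for style in recording["styles"]:
    name = descriptor_names[style["ID"]]
    # Steps 3-4: find hierarchy nodes whose mappedDescriptors include this ID.
    levels = [(n["level"], n["nodeName"]) for n in hierarchy_nodes
              if style["ID"] in n["mappedDescriptors"]]
    # Step 5: collect style name, hierarchy levels, and weight for display.
    results.append((name, levels, style["weight"]))

print(results)
```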
Sonic Style Examples
Examples below are based on a Recording of "Can't Buy Me Love" by The Beatles: https://gmd.gracenote.com/recording/GNEB50GY0F5GCZ6
Recording JSON Output
Code
Note: If there are no styles for a recording, the style descriptor block value is assigned as null. For example: "styles": null
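Consumers should guard against this null case before iterating. A minimal sketch:

```python
# Per the note above, a recording with no styles carries "styles": null,
# which deserializes to None in Python.
recording = {"styles": None}
styles = recording.get("styles") or []  # treat a null block as "no styles"
print(len(styles))  # 0
```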
Sonic Styles in the Descriptor JSON Output
Below is a snippet from the Descriptor file showing the style IDs and their corresponding names.
Note: Descriptor names are not localized.
Code
Sonic Styles in the Hierarchy JSON Output
The example below is a snippet from the Hierarchy file for Sonic Style Descriptor "British Invasion" (Style ID: 84206). For brevity, the localizedNames array is truncated in this example.
Note: Hierarchy names are localized.
Level 1 Hierarchy
Code
Level 2 Hierarchy
Code