Fragmented Definition

If tokens are meant to be the source of truth, then defining tokens directly in a Collection introduces fragmentation by allowing multiple places for tokens to be defined.
If I were to fetch a list of all tokens in a DSP, not only do I have to look for Token entities (`{ type: 'token' }`, which makes sense), but I also have to be aware of tokens defined in Collection entities (`{ type: 'collection', tokens: [ ... ] }`).

What happens if I define a token alias inside of a Collection?
- Does the alias point to another token in the same Collection?
- Does it point to a Token entity (external to the Collection)?
- Does it point to a token defined in a different Collection?

What happens if I define a Collection token with the same `id` as a Token entity?
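To make the fragmentation concrete, here is a minimal TypeScript sketch of what a consumer has to do just to enumerate every token. All type and field names here are illustrative simplifications, not the actual DSP spec:

```typescript
// Illustrative, simplified entity shapes -- not the actual DSP spec.
type TokenEntity = { class: string; type: 'token'; id: string; name: string; value: string };
type CollectionToken = { type: string; id: string; key: string; value: string };
type CollectionEntity = { class: string; type: 'collection'; id: string; tokens: CollectionToken[] };
type Entity = TokenEntity | CollectionEntity;

// A consumer cannot simply filter on { type: 'token' }; it must also
// unpack the tokens embedded inside every Collection entity.
function allTokens(entities: Entity[]): Array<TokenEntity | CollectionToken> {
  const result: Array<TokenEntity | CollectionToken> = [];
  for (const e of entities) {
    if (e.type === 'token') {
      result.push(e); // a top-level Token entity
    } else {
      result.push(...e.tokens); // tokens hidden inside a Collection
    }
  }
  return result;
}
```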
Fragmented Data Structure
In addition to the above, it is also unclear how a DSP consumer should normalize the definition of a token, because a Token entity and a Collection token have different data structures.
| Property | Token entity | Collection token |
| --- | --- | --- |
| class | x | implied |
| type | x | x |
| id | x | x |
| name | x | |
| value | x | x |
| last_updated | x | |
| last_update_author | x | |
| description | x | |
| key | | x |
- How does the `key` property differ from `id` in a Collection token?
- How do I know when a specific Collection token was last updated?
  - You can tell when the Collection was last updated, but not individual tokens within it.
- How do I know who last updated a Collection token?
  - You can tell who last updated the Collection, but not individual tokens within it.
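The cost of these mismatched structures shows up as soon as a consumer tries to normalize the two shapes into one record. In the sketch below (field and type names are illustrative assumptions, not the DSP spec), the Collection-token branch simply has no data for several fields:

```typescript
// Illustrative, simplified shapes -- not the actual DSP spec.
type TokenEntity = {
  class: string; type: 'token'; id: string; name: string; value: string;
  last_updated: string; last_update_author: string; description: string;
};
type CollectionToken = { type: string; id: string; key: string; value: string };

type NormalizedToken = {
  id: string;
  name?: string;               // Collection tokens only have `key` -- is that the same thing?
  value: string;
  last_updated?: string;       // unknowable for a Collection token
  last_update_author?: string; // unknowable for a Collection token
  description?: string;        // unknowable for a Collection token
};

function normalize(t: TokenEntity | CollectionToken): NormalizedToken {
  if ('name' in t) {
    const { id, name, value, last_updated, last_update_author, description } = t;
    return { id, name, value, last_updated, last_update_author, description };
  }
  // Collection token: treat `key` as the name (a guess -- the spec does not
  // say), and leave the audit/description fields empty.
  return { id: t.id, name: t.key, value: t.value };
}
```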
Proposed Solution
What if we were to update Collection entities to act as a logical grouping of Token references, instead of a physical grouping of Token values?
- Token entities would serve as the single source of truth for token definitions.
- Collections would then reference Token entities via an `entity_refs` property.
  - NOTE: the exact data structure needs clarification (see "Unanswered Questions" below).
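As a rough sketch of the proposal (the shape and values below are assumptions for illustration, not the spec), a Collection would carry only references:

```typescript
// Hypothetical: a Collection that references tokens by id instead of
// embedding token definitions. Field names and values are illustrative.
const collection = {
  class: 'collection',
  type: 'collection',
  id: 'brand-colors',
  // References to Token entities -- the definitions themselves stay
  // with the Token entities, the single source of truth.
  entity_refs: ['token-red-500', 'token-blue-500'],
};
```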
Unanswered Questions
"{Entity-ID}"
strings"Entity-ID"
strings{ "<alias>": "<token-entity-id>" }
object format?The text was updated successfully, but these errors were encountered:
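For concreteness, the three candidate formats could look like this (all values are illustrative, and none of these shapes is part of the spec):

```typescript
// Three possible shapes for "entity_refs" -- all hypothetical.
const asTemplatedStrings = ['{token-red-500}', '{token-blue-500}']; // "{Entity-ID}" strings
const asPlainStrings = ['token-red-500', 'token-blue-500'];         // "Entity-ID" strings
const asAliasObjects = [                                            // { "<alias>": "<token-entity-id>" }
  { primary: 'token-red-500' },
  { accent: 'token-blue-500' },
];
```

The object format would additionally let a Collection give a token a local alias, which may or may not be desirable.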