mmmmmm licorice strings
hey so I like programming! I do it for money sometimes. I'm gonna talk about an interesting problem, and its solution, that I ran into today. if you don't enjoy code for the sake of code, you can skip this one! or, if you already know what TypeScript is, you can skip ahead to the problem.
context #
so there's this software I use all the time called TypeScript, which is like JavaScript but with types. would you like a contrived example to show why I use it? I bet you would!
here's a snippet of JavaScript code I might write to generate some repetitive text, like the 99 bottles of beer song.
let n = '99';
while (n > 0) {
  console.log(n + ' bottles of beer on the wall');
  n = n - 1;
}
counting down like this will work just fine. But if I do the opposite:
let n = '1';
while (n <= 12) {
  console.log('on the ' + n + 'th day of christmas...');
  n = n + 1;
}
this won't work correctly! It'll do something like this:
on the 1th day of christmas...
on the 11th day of christmas...
on the 111th day of christmas...
that's because I've mistakenly defined n as '1', in quotes, which means it's actually a string, and not a number. when you do math on things that aren't numbers, sometimes it works how you expect, but not always: subtraction always coerces its operands to numbers, so '99' - 1 is 98 and the countdown works, but + prefers string concatenation, so '1' + 1 is '11'. it would be nice if I had a way to scan my source code and say "whoops, did I ever treat the same thing as a string and a number without meaning to do that?" enter TypeScript:
let n = '1';
while (n <= 12) {
  // TYPE ERROR: Operator '<=' cannot be applied to types 'string' and 'number'.
TypeScript sees that I messed up, and it says "hey, that's not something you would normally do with the less-than-or-equal operator. check your types."
you can also intentionally tell typescript "hey, I'm expecting this to be a particular type" and it'll throw the error even earlier:
let n: number = '1';
// Type 'string' is not assignable to type 'number'.
the problem #
okay so typescript is too clever for its own good. it uses structural typing: if two types look the same, they are the same. so I can tell typescript "hey, assign this string to that string" and it'll say "sure" even if those strings have differently named types.
type Name = string;
type Color = string;
const name1: Name = 'Alice';
const color1: Color = name1; // no error?
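and it's not just strings, by the way. here's a quick sketch of my own with two made-up object types, Point and Vector:
type Point = { x: number; y: number };
type Vector = { x: number; y: number };

const p: Point = { x: 1, y: 2 };
const v: Vector = p; // no error: same shape, so same type as far as TypeScript cares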
most of the time this is fine. if it fits, it fits. but what if I want this to throw an error? one solution is "flavoring", as described in this excellent 2018 article by Drew Colthorp.
basically, I add a "fake property" to the Name and Color types, and then TypeScript will know they're not interchangeable.
type Name = string & { type: 'Name' };
type Color = string & { type: 'Color' };
const name1: Name = 'Alice';
// Type 'string' is not assignable to type 'Name'.
...uh. TypeScript doesn't like that anymore; it knows that 'Alice' can't possibly be a Name because it doesn't have that extra field. and we can't actually add that data, because strings are a primitive type and don't have fields. this whole thing is just a fiction to make TypeScript understand that names and colors are different kinds of strings.
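(a quick aside from me, not from the article: the usual workaround here is called "branding", where you silence the error with a type assertion every time you create a Name:
const alice = 'Alice' as Name; // compiles, but now every single Name needs an explicit cast
which works, but sprinkling casts everywhere is noisy.)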
that's fine though! we'll make the field optional so that TypeScript doesn't care whether we use it or not. and while we're at it, we'll make it readonly so TypeScript doesn't think we can write to it.
type Name = string & { readonly type?: 'Name' };
type Color = string & { readonly type?: 'Color' };
const name1: Name = 'Alice';
const color1: Color = name1;
// Type 'Name' is not assignable to type 'Color'.
bingo! just what we wanted. now if we accidentally mix up these types later, we'll get a nice friendly error message about it.
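and the nice part of the optional field, compared to the required one above, is that plain strings still flow in with zero ceremony. a little sketch of mine:
const name2: Name = 'Bob';         // ok: plain string literals still assign freely
const shout = name1.toUpperCase(); // ok: a Name is still just a string underneath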
if we want to use these types a lot, we might make a generic type to describe the pattern:
type FlavoredString<F> = string & { readonly type?: F };
type Name = FlavoredString<'Name'>;
type Color = FlavoredString<'Color'>;
we could even make the pattern fully generic, but then we have to handle the case that type already exists on the thing we're adding flavor to. thankfully, the standard library provides the Symbol type. a Symbol is just like a string except that it's guaranteed to be unique, meaning it won't collide with any of the existing properties on our base object. we don't even have to make a real symbol, we can just tell typescript "imagine a symbol".
declare const flavor: unique symbol;
type Flavored<T, F> = T & { readonly [flavor]?: F };
type Name = Flavored<string, 'Name'>;
type Color = Flavored<string, 'Color'>;
const name1: Name = 'Alice';
const color1: Color = name1;
// Type 'Name' is not assignable to type 'Color'.
remember, the [flavor] property doesn't really exist; it's a fiction we made up for TypeScript to catch our errors. in the output code, name1 will still just be a string.
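and since Flavored is fully generic now, it works on more than strings. here's a sketch of mine with hypothetical numeric id types:
type UserId = Flavored<number, 'UserId'>;
type PostId = Flavored<number, 'PostId'>;

const userId: UserId = 42;     // ok: plain numbers still assign freely
const postId: PostId = userId; // Type 'UserId' is not assignable to type 'PostId'.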
and there you go! now you can be extra pedantic about all those database ids being thrown around in your code. they're not really all the same type, right? you wouldn't assign an id string to a display string, right?