JavaScript Custom Data Types

Back in the days of Borland’s Turbo Pascal, in what now seems like the Jurassic Period of programming, it was possible to create custom data types which behaved the way programmers wanted.

For example, along with the common String, Integer, LongInt, etc., one could easily define a type called “Month” that, when increased past 12, would wrap back to 1, and when decreased past 1, would wrap back to 12. That would save programmers a lot of code and validation; one would not have to check the result of increasing a ‘Month’ variable: it would always be valid. And it could even be done with Strings, if memory serves me well.

So…

declare vacation as Month;
vacation = 11;
console.log(vacation + 3) // Outputs 2

or…

declare vacation as Month;
vacation = 3;             // Sets vacation to 'March'
console.log(vacation - 4) // Outputs 'November'

Of course this is not real code, but it serves to illustrate what I mean.
Is there anything similar in JavaScript?

Nothing like that in JavaScript. You have primitive types, object types, and the ability to define your own object types with classes, but not much beyond that. A variable like this will always be coerced to a primitive when used with an operator like subtraction (-). So no matter what you do, subtracting 4 from something will give you some number (or, more likely, NaN), because the operation tries to derive numbers from both of its operands.
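A minimal sketch of that coercion (the `march` object here is hypothetical): an object can customize its numeric conversion via valueOf, but the result of the operator is still an ordinary primitive number, never a new custom value:

```javascript
// A plain object has no useful numeric conversion, so subtracting
// from it produces NaN:
console.log({} - 4); // NaN

// An object can supply its own conversion with valueOf, but the
// subtraction still just yields a regular number:
const march = { valueOf() { return 3; } };
console.log(march - 4); // -1
```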

The closest you can get is to create an accessor property which, when set, can apply some logic to wrap a number as in your example. This can even be done on a global so that it appears to operate on a “variable”, though it’s effectively an operation applied to a property of the global object.

{
  let val = 1;
  Object.defineProperty(window, 'vacation', {
    get () {
      return val;
    },
    set (value) {
      // Wrap the assigned value into the 1–12 range
      val = (value - 1) % 12 + 1;
      // JavaScript's % can yield a negative result, so shift back up
      if (val < 1) val += 12;
    }
  });
}

vacation = 11;
console.log(vacation); // 11
vacation += 3;
console.log(vacation); // 2

Notice that the value has to be set (+=, not just +). The behavior isn’t applied by the operation itself (the addition), but rather by the assignment of the result to the property.
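To make that concrete, here’s a self-contained restatement of the same accessor technique under a hypothetical `month` name (using globalThis so it also runs outside a browser), showing that a bare addition never wraps; only the assignment does:

```javascript
{
  let val = 1;
  // Accessor on the global object; the setter wraps into 1–12
  Object.defineProperty(globalThis, 'month', {
    get () { return val; },
    set (value) {
      val = (value - 1) % 12 + 1;
      if (val < 1) val += 12;
    }
  });
}

month = 11;
console.log(month + 3); // 14: the addition itself never wraps
month += 3;             // the assignment invokes the setter
console.log(month);     // 2
```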

And generally you wouldn’t want to be doing this in the global space, but it could just as well be applied to a custom object. You’d just need to be sure to qualify the variable through that object.
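As a sketch of that, the same accessor can live on an ordinary object; the `calendar` and `month` names here are just placeholders:

```javascript
// Holder object so nothing leaks into the global space
const calendar = {};
{
  let val = 1;
  Object.defineProperty(calendar, 'month', {
    get () { return val; },
    set (value) {
      // Same wrapping logic as before, now scoped to calendar
      val = (value - 1) % 12 + 1;
      if (val < 1) val += 12;
    }
  });
}

calendar.month = 11;
calendar.month += 3;
console.log(calendar.month); // 2
```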

Additionally, some built-in object types have special behavior that you can’t get with normal objects. Arrays, for example, have a “magical” length property that automatically tracks the largest indexed property in the object (it is always one greater than the highest index). You can get this same behavior in your own custom object type by extending Array.

class MagicLength extends Array {
  // ...
}
let magic = new MagicLength();
console.log(magic.length); // 0
magic[3] = 'value';
console.log(magic.length); // 4

Of course you could also just make magic an instance of Array if that’s all you wanted, but extending it allows you to add extra functionality. And you get some identity benefits as well.

console.log(magic instanceof MagicLength); // true
console.log(new Array() instanceof MagicLength); // false
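As a sketch of that “extra functionality” point, here’s a hypothetical subclass (the `LastItem` and `last` names are made up for illustration) that keeps Array’s magic length while adding a helper of its own:

```javascript
// Subclass keeps Array's automatic length behavior...
class LastItem extends Array {
  // ...while adding a convenience method of its own
  last() {
    return this[this.length - 1];
  }
}

const items = new LastItem();
items[3] = 'value';
console.log(items.length); // 4
console.log(items.last()); // 'value'
```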

Outstanding! Sure, it is not the same but that is OK. It points me to new ideas and I have learned something new.

Thank you, Senocular. 🙂