Test Post Do Not Eat
I love eating catfish.
People who live near the sea say they're gritty and that you should go eat a real fish like a grouper or a snapper. I think they're probably right, but there's a kind of pastoral, LARP-y appeal to grilled catfish and rice with some tempeh on the side (Lele Penyet). Catfish are also really cute with their little whiskers, whereas most good-eating sea fish are not. Attractive parts of catfish are:
- Whiskers
- Little tails
- Stabbers
- Beady eyes
Wait, you might say.
I think that being cute makes them more appetizing. I don't really look at a monkfish and think it will be tasty until its flesh is divorced from its outward countenance. I had a monkfish liver once. In Japan this is called "ankimo". It was preposterously briny and soft like foie gras mixed with caviar. I want to try it again some day.
I got an email from Noma telling me that they'll open for Ocean Season soon. I might actually go this time.
Did you know that there is a Malay folk tale about Singapore being attacked by swordfish?
Here is an (almost) configuration-free GDScript utility function normalizer.
```gdscript
## Exponential Moving Z-Score normalizer.
## Allows stable normalization of utility functions of unknown range.
class_name EMZNormalizer
extends RefCounted

var mean: float = NAN
var variance: float = 0.0
var alpha: float


func _init(p_alpha: float = 0.05) -> void:
	alpha = p_alpha


## Normalizes the value and updates the statistics.
func normalize(value: float) -> float:
	if is_nan(mean):
		mean = value
	# Guard against dividing by zero before any spread has been observed.
	var norm: float = 0.0 if variance == 0.0 else (value - mean) / sqrt(variance)
	var diff: float = value - mean
	var incr: float = alpha * diff
	mean += incr
	variance = (1.0 - alpha) * (variance + diff * incr)
	return norm
```
This is a test post, but the code should actually work for you if you were thinking about using it. It's derived from a paper about online normalizers: Dynamic feature scaling for online learning of binary classifiers (Bollegala).
"Online" in this case means that model training must occur at the same time data points are collected. In other words, there is no way to normalize against all data that exists, only what has been seen so far.
Anyway, this normalizer should learn the range of your utility function by itself and only needs its decay constant alpha configured. You can add clipping if you like, but beware that Z-scores are zero-centered and many values will have a magnitude greater than 1.
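Since the point is the update rule rather than Godot specifically, here's the same normalizer as a Python sketch you can poke at outside the engine (the class name mirrors the GDScript; the sample scores are made up):

```python
import math

class EMZNormalizer:
    """Exponential moving Z-score normalizer (Python port of the GDScript above)."""

    def __init__(self, alpha: float = 0.05):
        self.alpha = alpha
        self.mean = None   # unset until the first sample arrives
        self.variance = 0.0

    def normalize(self, value: float) -> float:
        if self.mean is None:
            self.mean = value
        # No spread observed yet, so the z-score is defined as 0.
        norm = 0.0 if self.variance == 0.0 else (value - self.mean) / math.sqrt(self.variance)
        diff = value - self.mean
        incr = self.alpha * diff
        self.mean += incr
        self.variance = (1.0 - self.alpha) * (self.variance + diff * incr)
        return norm

# Feed a stream of raw utility scores; the first outputs are 0 until
# some variance has accumulated, then values become z-scores.
n = EMZNormalizer()
scores = [n.normalize(v) for v in [10.0, 12.0, 8.0, 11.0, 9.0]]
```

Early z-scores can overshoot (the variance estimate is still tiny), which is exactly why clipping is a reasonable thing to bolt on if downstream code expects a bounded range.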
I made this out of frustration with the utility normalizations commonly suggested by game developers. The GameAIPro article Taming Spatial Queries – Tips for Natural Position Selection is fun to read if you're just learning about position selectors, the least studied of game AI primitives, to the point where it doesn't even have a standard name (it's "TPS" in CryEngine and "EQS" in recent versions of Unreal). But the article also seems to suggest that only unfavourable compromises are possible.
This test was meant to be a CMS test, mostly.
Thank you for reading my blog post. Salut.