

But what I do accept are tips and wisdom like your int16 hint.

On the other hand, like you said: it doesn't matter. My JavaScript Dungeon Master game uses this algorithm of mine, which I wrote without even knowing the redmcsb source:
Code:
// this.seed = 99
// LCG step: advances the seed and returns a pseudo-random float in [0, 1).
srand(): number {
    this.seed = (this.seed * 9301 + 49297) % 233280;
    return this.seed / 233280;
}

// Returns an integer in [lower, higher); called with one argument it means [0, lower).
randomInRange(lower: number, higher: number = 0): number {
    // Single-argument call: treat the argument as the upper bound.
    if (higher === 0) {
        [lower, higher] = [0, lower];
    }
    // Swap if the bounds came in the wrong order.
    if (lower > higher) {
        [lower, higher] = [higher, lower];
    }
    const range = higher - lower;
    const draw = this.srand() * range;
    return Math.floor(draw) + lower;
}

// later in code:
// randomInRange(0, 29);

What motivates me here is the "why": what am I not seeing, so that the same code, run on two platforms, gives different results? My brain doesn't even stop when I'm sleeping. When I woke up this morning, I thought: maybe the compiler used int32 internally for the calculations and only forced the result down to int16 afterwards. Then I thought: wasn't my Amiga a 32-bit machine, with its D0/A0 registers being 32 bits wide? That would make sense. Then I thought: in 1987, wasn't the Intel CPU half 16-bit, half 32-bit at the time? When 32-bit arithmetic is used, do all the shift operations automatically push the value down to 16 bits (even with the final modulo step), or is endianness an issue here? But the 99 is the only value that comes from an "external" source...
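To make that int16-vs-int32 idea concrete, here is a minimal sketch. It reuses the constants from my own algorithm above as a stand-in (not the real DM ones), and just shows how wrapping every intermediate result back into 16 bits changes the output of the very same formula:

Code:
// Sketch: the same LCG step, computed with two different intermediate widths.
// Constants 9301 / 49297 / 233280 are from my own generator, used here only
// as an illustration.

// "32-bit style": multiply and add at full precision, then reduce.
function next32(seed: number): number {
    return (seed * 9301 + 49297) % 233280;
}

// "16-bit style": wrap each intermediate back into 16 bits, the way a
// 16-bit CPU (or a compiler computing in int16) would.
function next16(seed: number): number {
    const product = (seed * 9301) & 0xffff; // multiplication wraps at 16 bits
    const sum = (product + 49297) & 0xffff; // addition wraps at 16 bits
    return sum % 233280;
}

console.log(next32(99)); // 36976 -- the high bits of 99 * 9301 survive
console.log(next16(99)); // 52592 -- the high bits are lost, a different stream

(As a side note: for seed values near 233279, seed * 9301 + 49297 even exceeds the signed 32-bit range, so a plain int32 build of this formula would overflow as well.)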
And so my mind drifts off, searching for possibilities... and this is fun.

So I'm not a data miner, but more of an algorithm-miner.

