Binary Patterns of Integer Functions
Intricate patterns emerge when the results of integer functions are written out in binary, row by row. Is there a cellular automaton rule (and initial condition) that produces the same behavior and performs the same computation?
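As an illustrative sketch (the choice of function, here f(n) = 3^n, and the rendering characters are our own assumptions, not part of the original statement), one way to see such a pattern is to stack the binary representations of successive function values, right-aligned, and print digit 1 as a filled cell:

```python
# Render the binary digit pattern of an integer function as rows of cells.
# f(n) = 3**n is one illustrative choice; any integer-valued function works.
def binary_rows(f, n_max, width=None):
    """Return right-aligned binary strings of f(0), ..., f(n_max - 1)."""
    rows = [bin(f(n))[2:] for n in range(n_max)]
    width = width or max(len(r) for r in rows)
    return [r.rjust(width, "0") for r in rows]

if __name__ == "__main__":
    for row in binary_rows(lambda n: 3 ** n, 16):
        # Draw each 1-bit as '#' and each 0-bit as a blank cell.
        print(row.replace("0", " ").replace("1", "#"))
```

Viewed this way, each row is a configuration of cells, which makes the comparison to a cellular automaton evolution natural: the question is whether some local update rule reproduces the step from one row to the next.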