
Behavior interoperability

ELI5: (1) Behavior interoperability across on-chain realities will be manifested through standardized computational models. (2) To define a computational model for behavior, we must define the computational model for the physicality of the reality in which those behaviors occur.

Object interoperability is behavior interoperability

What does it mean to have object interop?

An object is apparently more than a hexstring that uniquely identifies it within the scope of an ERC-721 contract. An object, in its digital representation, may comprise parameters that specify its properties and code that describes its behavior; properties are ultimately manifested by behavior (e.g. the restitution coefficient impacts behavior animated by a physics engine; aggressiveness impacts behavior animated by an AI engine). Essentially, an object is its behavior. Therefore, object interop means having objects behave in consistent ways across different realities. Object interop is behavior interop.
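
To make this concrete, here is a hedged sketch of the idea in TypeScript: an object carries parameters whose meaning only appears when an engine animates the behavior they parameterize. All type and function names here are illustrative assumptions, not an actual standard.

```typescript
// Hedged sketch: an object is more than its token id. Its parameters only
// become observable when an engine animates the behavior they parameterize.
// All names are illustrative assumptions, not an actual standard.

interface PhysicalProperties {
  mass: number;        // kg
  restitution: number; // 0..1; manifested as bounce behavior by a physics engine
}

interface AgencyProperties {
  aggressiveness: number; // 0..1; manifested as decisions by an AI engine
}

interface WorldObject {
  tokenId: string;              // the ERC-721 identity: a unique hexstring
  physical: PhysicalProperties; // parameters manifested through physics
  agency?: AgencyProperties;    // parameters manifested through agency (if an agent)
}

// The same object behaves consistently across realities only if each reality
// animates these parameters with a compatible computational model.
function bounce(obj: WorldObject, impactVelocity: number): number {
  return -obj.physical.restitution * impactVelocity; // rebound velocity
}
```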

Behavior is manifested via computational model

What is behavior?

A physical behavior is the changing of physical states of objects, which must respect physical laws – this is manifested by computational models that deal with the physicality of objects and the environment. An agency behavior primarily involves non-player agents making decisions with respect to sensory input and internal states to drive actuators. This requires an entire pipeline of computational models: computing sensor input, performing feature extraction, computing internal state transitions that carry out decision making, computing action sequences, and driving actuators. Behavior is manifested by its corresponding computational models.
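
The agency pipeline above can be sketched end to end in code. This is a hedged sketch, not a proposed standard: every stage name, type, and stand-in implementation here is an illustrative assumption, and each stage is a computational model that would itself need standardizing.

```typescript
// Hedged sketch of the agency pipeline: sensing -> feature extraction ->
// internal state transition (decision making) -> action sequence -> actuation.
// All names and stage implementations are illustrative assumptions.

type SensorInput = number[];   // raw readings exposed by the environment
type Features = number[];      // features extracted from sensor input
type InternalState = number[]; // the agent's memory / hidden state
type Action = { actuator: string; command: number };

function sense(world: number[]): SensorInput {
  return world; // stand-in: perceive the environment directly
}

function extractFeatures(input: SensorInput): Features {
  return input.map((x) => Math.tanh(x)); // stand-in normalization
}

function transition(state: InternalState, f: Features): InternalState {
  // decaying memory blended with fresh features; decisions live in this state
  return state.map((s, i) => 0.9 * s + 0.1 * (f[i] ?? 0));
}

function decide(state: InternalState): Action[] {
  // compute an action sequence from the internal state
  return [{ actuator: "motor-0", command: state[0] ?? 0 }];
}

// One tick of agency behavior: returns the next internal state and the
// actions that drive the actuators.
function agentTick(state: InternalState, world: number[]): [InternalState, Action[]] {
  const next = transition(state, extractFeatures(sense(world)));
  return [next, decide(next)];
}
```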

Behavior interoperability requires standardized computational models

It is thus apparent that to achieve behavior interoperability, we must standardize the computational models for computing the behaviors of interest; realities will be interoperable to the degree to which they are compatible with these standardized computational models. Notice that this is far more complex than a “token standard”, where one defines the abstract interface of a class (contract) to regulate ownership and rules of transfer. It is also far more complex than file format standards and data structure standards. We need computational models defined for both physicality and agency.
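
The contrast can be made concrete with a hedged sketch. A token standard fixes only an abstract interface, leaving behavior to each implementation; a computational-model standard must fix the state-transition semantics themselves so every conforming reality computes the identical next state. The names and constants below are illustrative assumptions.

```typescript
// Token-standard style: an interface; behavior is left to each implementation.
interface TokenStandard {
  ownerOf(tokenId: string): string;
  transferFrom(from: string, to: string, tokenId: string): void;
}

// Computational-model style: constants and update rules are part of the
// standard, so every conforming reality computes the identical next state.
type PhysicsState = { position: number; velocity: number };

const GRAVITY = -9.81; // fixed by the standard, not by each implementation

function step(s: PhysicsState, dt: number): PhysicsState {
  // semi-implicit Euler integration, identical in every conforming reality
  const velocity = s.velocity + GRAVITY * dt;
  return { position: s.position + velocity * dt, velocity };
}
```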

Subject and object are inseparable

Physicality defines both what is perceptible by agents and what can be acted upon by agents. Not only do we need to define computational models for both physicality and agency, but we need to define them in parallel.
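
A hedged sketch of this inseparability: the physical model fixes both the agent's percepts and its actuators, so the agency model cannot be specified independently of it. All names below are illustrative assumptions.

```typescript
type PhysicalState = { lightLevel: number; doorOpen: boolean };

// Physicality defines the perceptible: agents sense only what physics exposes.
function perceive(s: PhysicalState): number[] {
  return [s.lightLevel, s.doorOpen ? 1 : 0];
}

// Physicality defines the actable: agents change only what physics admits.
function applyAction(s: PhysicalState, toggleDoor: boolean): PhysicalState {
  return toggleDoor ? { ...s, doorOpen: !s.doorOpen } : s;
}

// An agency model written against a different PhysicalState would reference
// percepts and actuators that do not exist in this reality.
```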

Long term: neural networks

As we witnessed with Moore’s Law, and with the advent of GPUs propelling learning-based approaches far beyond heuristics-based approaches starting a decade ago (AlexNet), computational models ultimately scale with the confines of the underlying hardware. Deep learning workloads could scale tremendously well with improvements in photonic computation. Ultimately, I believe the computational models of both physicality and agency will be crucially constructed with neural networks. We could be a couple of decades away from this, given how early both verifiable computing and photonic computing still are.
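
As a minimal sketch of what this could look like, here is a physical state-transition function expressed as a tiny neural network rather than hand-written rules; in this framing, standardizing the model means standardizing the architecture and the (verifiably computed) weights. Shapes and names are illustrative assumptions.

```typescript
type Vec = number[];

// One dense layer with tanh activation: y = tanh(W x + b).
function dense(W: number[][], b: Vec, x: Vec): Vec {
  return W.map((row, i) =>
    Math.tanh(row.reduce((acc, w, j) => acc + w * x[j], b[i]))
  );
}

// A learned state-transition function: nextState = f_theta(state).
function learnedStep(W: number[][], b: Vec, state: Vec): Vec {
  return dense(W, b, state);
}
```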

Why does Topology focus on interop?

Interop tilts power towards the creators, the truly creative minds. By tilting this power, more resources are redistributed towards those who can create, leading to richer and deeper worlds woven by their creations; these variegated domains of creative endeavor attract more creative minds, forming a flywheel. With interop we achieve this flywheel without subjecting creative minds to overworked routines in corporate environments where the power structure favors the capital people, not the creative people. This is why Topology is excited about interoperability.