- Introduction
- Phase 1: Getting animations synchronized
- Phase 2: Exposing this to LSL
- Phase 3: Possibly re-exploring puppetry and LSL-generated animations
Syncing animations in Second Life has long been desired by Residents. I am proposing a multi-phase solution to that problem and more.
First, we need a metronome that all viewers can stay in sync with. We have a message which I think fits the bill pretty well:
```
// SimulatorViewerTimeMessage - Allows viewer to resynch to world time
{
    SimulatorViewerTimeMessage Low 150 Trusted Unencoded
    {
        TimeInfo Single
        { UsecSinceStart U64 }
        { SecPerDay U32 }
        { SecPerYear U32 }
        { SunDirection LLVector3 }
        { SunPhase F32 }
        { SunAngVelocity LLVector3 }
    }
}
```
Specifically, we are looking at the UsecSinceStart parameter.
Pros:
- This is a high precision timer suitable for features that require high accuracy, such as animations.
- It isn't prone to integer overflow, since the counter resets on simulator restarts.
- It isn't prone to the year 2038 problem because it isn't a Unix timestamp.
Cons:
- I'm not 100% certain how often this is sent; I believe it is once per second, which should be enough.
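Assuming the once-per-second cadence holds, the viewer only needs to extrapolate between samples. A minimal sketch of that bookkeeping, with hypothetical names (this is not existing viewer code):

```cpp
#include <chrono>
#include <cstdint>

// Tracks the simulator metronome: latch each UsecSinceStart sample against
// the local monotonic clock, then extrapolate between messages.
class RegionClock
{
public:
    // Call whenever a SimulatorViewerTimeMessage arrives.
    void onTimeMessage(uint64_t usec_since_start)
    {
        mLastRegionUsec = usec_since_start;
        mLastLocalTime = std::chrono::steady_clock::now();
    }

    // Estimated region time "now", in usec since region start.
    uint64_t nowUsec() const
    {
        auto elapsed = std::chrono::steady_clock::now() - mLastLocalTime;
        return mLastRegionUsec + static_cast<uint64_t>(
            std::chrono::duration_cast<std::chrono::microseconds>(elapsed).count());
    }

private:
    uint64_t mLastRegionUsec = 0;
    std::chrono::steady_clock::time_point mLastLocalTime{};
};
```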
We'd need a new animation message to convey our timestamps. Let's start with the base animation message and clean it up a bit:
- First, we don't need PhysicalAvatarEventData; to my knowledge it isn't used.
- We use a whole uint8 to convey one flag, which is wasteful! Let's turn it into a bit flag field.
- We need to convey the start time, preferably as a signed 64-bit integer to facilitate crossings between regions that started at different times.
With these combined, we get:
```
{
    AgentAnimationSynced High 5 NotTrusted Unencoded
    {
        AgentData Single
        { AgentID LLUUID }
        { SessionID LLUUID }
    }
    {
        AnimationList Variable
        { AnimID LLUUID }
        { AnimFlags U8 }
        { UsecSinceStart S64 }
    }
}
```
We use a signed integer for UsecSinceStart because if an agent crosses into a newly started region, the animation may have started in the past relative to that simulator's start time, leaving a negative offset.
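As a hedged sketch, assuming each region's start time is available as a wall-clock epoch (the function and parameter names below are invented for illustration), the rebase on a region crossing could look like:

```cpp
#include <cstdint>

// Re-express an animation start time, recorded as usec since the old
// region's start, relative to the new region's start. If the animation
// began before the new region booted, the result is negative, which is
// exactly why the field must be S64 rather than U64.
int64_t rebaseStartTime(int64_t start_usec_in_old_region,
                        int64_t old_region_epoch_usec, // wall clock at old region start
                        int64_t new_region_epoch_usec) // wall clock at new region start
{
    int64_t wall_clock_start = old_region_epoch_usec + start_usec_in_old_region;
    return wall_clock_start - new_region_epoch_usec; // may be negative
}
```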
We can use AnimFlags to tackle a long-requested feature: setting animation priority via LSL. We only need 3 bits for it.
Additionally, we can tack on "AnimStart" as a bit. That's 4 bits used, 4 bits left.
We could use the other four bits for things like disabling animation translations (AKA deforms), but that's up for further debate; one possible layout is sketched below.
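To make the bit budget concrete, here is one purely illustrative AnimFlags layout (the exact bit assignments are part of that debate):

```cpp
#include <cstdint>

enum AnimFlags : uint8_t
{
    ANIM_FLAG_PRIORITY_MASK = 0x07, // bits 0-2: animation priority, 0-7
    ANIM_FLAG_START         = 0x08, // bit 3: start (set) or stop (clear)
    // bits 4-7: reserved, e.g. for disabling translations ("deforms")
};

inline uint8_t packAnimFlags(uint8_t priority, bool start)
{
    return (priority & ANIM_FLAG_PRIORITY_MASK) | (start ? ANIM_FLAG_START : 0);
}
```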
We can sync up animations in an intelligent way that reduces jitter and behaves much like AAA games do when syncing animations: control the speed at which we animate avatars relative to the metronome the simulator sends us.
If we are behind, play the animation at a higher speed; if we are ahead, slow it down. We should do this using an exponential curve: when we are close to the desired timestamp, there should be no noticeable change in speed, but when we are well behind, we should definitely catch up, even if that means playing animations a bit fast.
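A minimal sketch of such a curve, with invented constants (real values would need tuning):

```cpp
#include <algorithm>
#include <cmath>

// Scale playback speed by how far the local animation clock has drifted
// from the simulator metronome. Near zero error the speed stays ~1.0;
// larger errors ramp the speed up (behind) or down (ahead) exponentially.
double playbackSpeed(double error_sec) // desired time minus current time
{
    const double kGain = 0.5;     // illustrative steepness of the curve
    const double kMaxScale = 2.0; // illustrative clamp on speed-up/slow-down

    double speed = std::exp(kGain * error_sec); // exp(0) == 1.0, i.e. no change
    return std::clamp(speed, 1.0 / kMaxScale, kMaxScale);
}
```

The clamp keeps a badly desynced client from playing animations at absurd speeds; beyond it, snapping straight to the target offset is probably kinder than a long catch-up.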
Animations should ALWAYS start playing at the point in time the offset dictates. We don't want people to pop into regions and see animations going wild while they load in.
The really nice part about doing animations this way is that we not only expose animation priority control but also gain access to bone positions and rotations.
The simulator is the source of truth for animation time sync, and it also knows what animations are playing, so if we simulate an avatar skeleton on the simulator, we have all the details we need without requiring the agent to send its information to the simulator.
This solves various problems, including potentially reopening the door for puppetry by reducing the amount of bandwidth required.
One of the major problems with puppetry was the sheer amount of bandwidth and messages required to stream animations. With this RFC implemented, we don't have to stream every animation to and from the viewer; we would only need to stream the animations an agent truly wants to animate, if any at all. I no longer see bandwidth being an issue.
This could lend additional support to the posers that exist in TPVs, potentially become a new feature in the Linden viewer, or allow people to stream animation previews from Blender into Second Life.
The second problem with puppetry is the potential for additional bones on the skeleton. This can be solved by having the bakes service "bake" a skeleton for the avatar and using the bone indices in the skeletal bake as the bone list, rather than the stock avatar skeleton. To conserve bandwidth, we'd assume the avatar uses the base default SL skeleton unless the bakes service tells us otherwise.
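As a small sketch of that fallback (types and names here are hypothetical, and the skeleton list is abbreviated):

```cpp
#include <string>
#include <vector>

// Puppetry data would index into a bone table. Unless the bakes service
// supplies a baked skeleton, assume the base default SL skeleton, so
// unmodified avatars cost no extra bone data on the wire.
std::vector<std::string> resolveBoneList(const std::vector<std::string>& baked_skeleton)
{
    static const std::vector<std::string> kDefaultSkeleton = {
        "mPelvis", "mTorso", "mChest", "mNeck", "mHead", // abbreviated for illustration
    };
    return baked_skeleton.empty() ? kDefaultSkeleton : baked_skeleton;
}
```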
With this, on top of an even tighter protocol (a bitstream, perhaps?), I could see puppetry becoming feasible.
I would be willing to write out a new protocol for puppetry that follows a bitstream style message parameter that sends the bare minimum of data. See https://gist.github.com/FelixWolf/d91632a719a4e91221ff468f3fee637a for my WIP.