...

First Frame, Last Frame (int): Encoding to movie files typically loses the start frame, making it a pain to identify which frame you are looking at. We could look at doing this with timecode, but sometimes you want both timecode and a frame number.
Source filename (string): Something to track where the encoded media came from.
Source ID (string): Unique ID from the vendor creating the content.
Source frame rate (float): If you are reviewing a proxy but still want to remap back to the source frame, knowing the source frame rate is required (DO WE NEED THIS AND LAST FRAME?). Useful for high-frame-rate media, e.g. 120 fps (MIGHT MAKE SENSE AS A STRING TO HANDLE 59.94 BETTER?).
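
As a rough sketch of why both the first frame and the source frame rate are useful: given a re-timed proxy, a review frame can be mapped back to a frame in the source media. The function and argument names below are illustrative only, not part of any proposed schema.

    def proxy_to_source_frame(proxy_frame, proxy_first_frame, proxy_rate,
                              source_first_frame, source_rate):
        # Elapsed time in the proxy since its first frame.
        elapsed_seconds = (proxy_frame - proxy_first_frame) / proxy_rate
        # Convert that elapsed time back to a frame index in the source media.
        return source_first_frame + round(elapsed_seconds * source_rate)

    # e.g. frame 120 of a 24 fps proxy of 120 fps source media starting at 1001:
    # proxy_to_source_frame(120, 1, 24.0, 1001, 120.0) -> 1596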
Image active area (xMin, yMin, xMax, yMax): The bounding box of the picture location within the image. This is used in cases where the image is a re-processed version of the source frame, e.g. where a 2.35 aspect ratio picture has been padded to HD (perhaps with timecode burnt in, etc.). This allows any annotations to always be defined relative to the source frames, so they can be correctly overlaid on top.
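
A minimal sketch of how the active area could be used to remap an annotation point, assuming the bounding box is given in pixels of the encoded image and the result should be in source-frame pixels (the function and argument names are hypothetical):

    def annotation_to_source(x, y, active_area, source_width, source_height):
        x_min, y_min, x_max, y_max = active_area
        # Position of the annotation relative to the active picture area (0..1).
        u = (x - x_min) / (x_max - x_min)
        v = (y - y_min) / (y_max - y_min)
        # Scale back up to source-frame pixel coordinates.
        return u * source_width, v * source_height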
Watermarking? (string): Document what sort of watermarking has been applied, e.g. invisible or burn-in.
Slate Length (int): Duration of the slate (0 if no slate).
Display Type (enum):

  • Stereo left/right
  • Stereo top/bottom
  • Long/Lat VR mono
  • Long/Lat VR Stereo top/bottom

NOTE: This should be based on existing standards, e.g. https://github.com/google/spatial-media/tree/master/spatialmedia
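
Purely as an illustration of how the field could be expressed, here is the list above as an enumeration; the value names are placeholders and are not taken from the spatial-media spec:

    from enum import Enum

    class DisplayType(Enum):
        STEREO_LEFT_RIGHT = "stereo_left_right"
        STEREO_TOP_BOTTOM = "stereo_top_bottom"
        VR_LONG_LAT_MONO = "vr_long_lat_mono"
        VR_LONG_LAT_STEREO_TOP_BOTTOM = "vr_long_lat_stereo_top_bottom"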

Color Space (string): Many file formats already have options for color spaces, but for internal reviews facilities may decide to encode to a non-standard color space. For media that crosses facilities we should stick to known embedded color spaces, and allow existing tools to remap where necessary.
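
For example, color metadata that an encoder has already embedded can usually be read back with existing tools; a sketch using ffprobe (assuming it is installed and on PATH):

    import json
    import subprocess

    def read_embedded_color_metadata(path):
        # Ask ffprobe for the color fields of the first video stream.
        result = subprocess.run(
            ["ffprobe", "-v", "error", "-select_streams", "v:0",
             "-show_entries", "stream=color_space,color_primaries,color_transfer",
             "-of", "json", path],
            capture_output=True, text=True, check=True,
        )
        streams = json.loads(result.stdout).get("streams", [])
        return streams[0] if streams else {}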

...