March 6, 2024
Time
9AM PST
Agenda
Update on formal work product presentation - Primal Cube
Shared link + collaboration on other topics to address with this asset
Validation framework feedback / discussion
GDC
Remedy Talks
18th to 22nd March
Update during next meeting
Update on formal work product presentation - Primal Cube
TJ will gather the material and set up the presentation page very soon.
Validation Framework
USD Validation Framework : https://github.com/PixarAnimationStudios/OpenUSD-proposals/pull/29
We are using a cache for the validation.
Question from TJ (Slack thread: https://academysoftwarefdn.slack.com/archives/C03GKF4DG7K/p1709671482409539):
Thomas Trently: Has anyone had a chance to check out the validation proposal yet? I was reading through it and was curious if it had any concepts for caching data to improve runtime speed. I've had to consider this a few times in other validation systems where we wanted to check multiple components at different levels but didn't want to pay the look-up cost each time. Ex: if I wanted multiple tests to process vertex data, I would want to loop vertices once and run each test while it has that data, vs. each test looping the vertex data itself.
Varun Talwar: Hey Thomas, thanks for reading the proposal. I haven't considered caching of scene data during test execution... will keep an eye out for it while I am working on the implementation.
Varun Talwar: I also feel it will be hard for the framework to introspect what individual tests are testing for such a cache. Maybe test writers can bundle tests appropriately into StageLevel tests and PrimLevel tests if there is an expensive operation they don't want to run for every single prim, etc.
Varun Talwar: Also, I will try to be on the call tomorrow if anyone else has questions.
Thomas Trently: That would be awesome; it would be nice to have some more discussion on it.
Varun Talwar: For sure, all suggestions and feedback are welcome.
Caching should happen only within a particular run, not across different runs.
E.g., querying all the triangles only once per run.
The same context could be shared by different tests within the same run. Validation should be very quick, so the idea is to reduce overall validation time.
There is a Fix It feature. It is awesome.
Internally, there was a long discussion, and the decision was that fixes should be applied manually.
But the client could, later on, decide to automatically apply some fixes.
Sometimes, content creators want automatic fixes (for example, when working with Unreal, some flags and parameters should be set up correctly).
Adding a flag to automatically apply some fixes?
It should be a trivial thing for clients to wrap given the right set of APIs, but the framework still shouldn't provide a means to always apply a fix. Varun took a note to extend the API.
Every project will have its own set of rules:
parameters would be different
validators could be config driven (eg : naming convention)
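A config-driven validator like the naming-convention example could be sketched as a factory that builds a check from project configuration. This is a hypothetical illustration; the config keys and function names are invented, not from the proposal.

```python
import re

# Hypothetical config-driven naming-convention validator (illustrative only;
# the key "prim_name_regex" is an invented config field, not a framework API).
def make_name_validator(config):
    pattern = re.compile(config["prim_name_regex"])
    def validate(prim_names):
        # Return the names that violate the project's convention.
        return [n for n in prim_names if not pattern.match(n)]
    return validate

# Each project supplies its own rules via config rather than new code.
project_config = {"prim_name_regex": r"^[A-Z][A-Za-z0-9_]*$"}
validate = make_name_validator(project_config)
assert validate(["Cube", "light_01", "Sphere"]) == ["light_01"]
```

The point of the factory shape is that swapping projects means swapping config, not rewriting validators.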
Example of automatic fix:
new light schema
BUT what if there are 2 ways of fixing the issue?
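One possible shape for the ambiguity above is an error that carries a list of candidate fixes, so a client auto-applies only when the fix is unambiguous. Everything here is a hypothetical sketch; `ValidationError` and the fix functions are invented for illustration.

```python
# Hypothetical error shape carrying multiple candidate fixes; the client,
# not the framework, decides whether and which one to apply.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class ValidationError:
    message: str
    fixes: List[Callable[[], None]] = field(default_factory=list)

state = {"schema": "old"}

def upgrade_in_place():
    state["schema"] = "new"

def rewrite_as_new_prim():
    state["schema"] = "new-prim"

err = ValidationError(
    "Light uses an outdated schema",
    fixes=[upgrade_in_place, rewrite_as_new_prim],
)

# A client with an auto-fix flag applies a fix only when it is unambiguous;
# with two candidates, it surfaces the choice to the user instead.
if len(err.fixes) == 1:
    err.fixes[0]()
assert state["schema"] == "old"  # nothing was applied automatically
```

This keeps the framework's stance from the notes: it reports and suggests, while the policy of applying fixes stays on the client side.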
Attendance
License: CC BY 4.0, Copyright Contributors to the ASWF USD working group.