
Group Project

Starting this project I initially wanted a smaller group, as I knew it would be easier to manage the workload and timings. I decided to collaborate with Carlotta, as we both have similar interests within VFX. We wanted to get in contact with an MA Virtual Reality student, as we were hoping to incorporate and learn more about VR this term, and we found one, Rita, who was happy to work with us. Our initial discussions consisted of agreeing on our aesthetic and theme; both Carlotta and I originally wanted to create an advert, as it ensures a clear purpose and narrative for the project. However, we are aware that compromise will be needed.

I wrote up a short description of our initial concepts for our production and Carlotta compiled images for our mood board.

After speaking with Rita, she was on board with the aesthetic and ideas we were going for. She had the idea of incorporating the concept of wellness and mental health into our production, so the project developed into a virtual space that a person can inhabit for the purpose of relieving stress and escapism.

Rita informed us that her tutor thought another VR student would be needed, so after meeting we decided to combine our group with another. We are now working in a group of six: four VFX students and two from VR.

The concept remains largely the same: creating a VR experience within the overarching theme of Escapism.

The plan is to create three different worlds within virtual reality: an energising and colourful one, a calming and relaxing world, and an adventure experience. We will create these worlds in pairs; Carlotta and I will work on the calming experience, adhering to our mood-board aesthetic.


After discussions with various members of the group, Carlotta and I decided on a rough layout and idea for our room. Speaking with the VR students, we were able to decide on three interactive elements for our experience: breathing exercises with a flower, lighting candles, and interacting with a gong.

We needed to ensure we were modelling and texturing our items in the correct way to be transferred into Unity.

I was also tasked with animating the face of a character within the VR experience: a penguin. I used blendshapes to create the correct expressions and movements for the script that Lauren had written.

Lauren wanted me to animate a hug for our penguin character, as he will be interacting with the user in VR. The model had not been rigged or skin-painted, so I did a very basic blendshape animation, moving the arms by manipulating vertices and faces.
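For illustration, a very simple version of this could be keyed through Maya's Python commands; the sketch below assumes a hypothetical hug target mesh sculpted from the base penguin, and the names are placeholders rather than the actual scene objects.

```python
import maya.cmds as cmds

# 'penguin_hug' is a duplicate of the base mesh with the arms pulled into the hug by hand;
# 'penguin_base' is the unrigged penguin model. Both names are placeholders.
blend = cmds.blendShape('penguin_hug', 'penguin_base', name='penguinShapes')[0]

# Key the blendshape weight from the rest pose into the hug and back out again.
for frame, weight in [(1, 0.0), (12, 1.0), (36, 1.0), (48, 0.0)]:
    cmds.setKeyframe(f'{blend}.penguin_hug', time=frame, value=weight)
```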

https://www.youtube.com/watch?v=GhqzQsd4o30

Another interactive element of the VR experience involves the penguin screaming; we want the user to be able to scream along with it, so I needed to animate this as well.

The issue we faced was that when I exported the animation for Lauren to use in Unity, the timing had to fit the audio perfectly, as she could not edit it further.

This meant that I had to align the talking, pauses, and scream with the voice over. I primarily used the graph editor to do this.

Animating the penguin in Maya was not too complicated; the real issue arose when exporting it to Unity. The model would constantly flicker between hard and smooth shading, and we noticed the scream animation did not translate well.

https://youtu.be/-w4TGGBF3HI

We needed to ask for advice as to why this was happening. However, due to time constraints it could not be fixed before the deadline.


I modelled and textured the gong for our scene. I am unsure if the colour scheme is right at the moment, but I will be able to change it as needed depending on the overall aesthetic of our room.

This is another interactive element of the experience; the handle needed to be made and exported separately so that the user can pick it up in VR.


I started working on modelling the candles for another interaction. The plan is to get the user to light them when in the ‘calm’ room.

I created colour maps for the candles, as I wanted a gradient on each of them.

We have an issue regarding the textures transferring from Maya to Unity. Lauren and I sat together to figure out how to correctly transfer the models, following a YouTube tutorial. Importing the items to Unity was a success; however, we found that some of the textures did not transfer, for example the frosted glass – I believe this could be because it is based on a preset in Arnold. We couldn't find a solution online, so the VR students will ask their tutor for advice.

Another issue we encountered was that the models were smooth-shaded in Maya, yet when opened in Unity they appeared blocky. Again, we couldn't figure out how to solve this, so Lauren will speak with her tutor.
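One thing we could try, assuming the blockiness comes from hard edge normals (Maya's viewport smooth-mesh preview is not baked into the exported file), is softening the normals before export. A minimal sketch in Maya Python:

```python
import maya.cmds as cmds

# Soften the edge normals on the selected meshes so they shade smoothly outside of Maya.
for mesh in cmds.ls(selection=True, long=True):
    cmds.polySoftEdge(mesh, angle=180, constructionHistory=False)
```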


I also needed to make a Corinthian column for the scene. I started the leaf detail in ZBrush, then imported it into Maya and used Duplicate Special to create the repetition around the column.
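The same repetition step could also be scripted instead of using Duplicate Special; this sketch assumes a hypothetical 'leaf_geo' object whose pivot has been moved to the column's central axis.

```python
import maya.cmds as cmds

copies = 8  # number of leaves around the capital
for i in range(1, copies):
    # Duplicate the sculpted leaf and spin each copy around the column's Y axis.
    dup = cmds.duplicate('leaf_geo', returnRootsOnly=True)[0]
    cmds.setAttr(f'{dup}.rotateY', (360.0 / copies) * i)
```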


Next I needed to join Carlotta in modelling furniture for the room.

I wanted to model an interesting sofa, and I found a good reference that I believe would fit the aesthetic. I initially started the model in Maya, attempting a few methods to decide what the best technique would be.

I decided the easiest and quickest way would be to sculpt by eye in ZBrush; with the symmetry tool activated I was able to model the sofa relatively well.

I also modelled a vase in ZBrush. I found an interesting reference photo, then sculpted it using the DamStandard and Smooth brushes.

I started the rug in Maya using a plane, then imported it into ZBrush to add texture using the GroomTurbulence brush.

After importing the furniture into a scene together, I started fixing the UVs on the sofa in order to texture efficiently later. I created the UVs in ZBrush, then cut and sewed the seams in Maya as needed.

I used Arnold presets to texture the furniture, to get an idea of how the light would work on the models. I also wanted to experiment initially with the colour scheme, but I will use my own textures to adhere to the palette from our mood board.

As transferring textures from Maya to Unity was an overarching problem we faced, Carlotta, Rita and I decided that the scene would be textured in Unity in order to fit the deadline.

Our room is completely modelled and the UVs are all organised, so we still hope to texture the room after submission.

We went to the final presentation of the VR students, in which we got feedback on the work from their tutors. This gave us a few days to work on the aesthetic of some of the rooms, and any elements we needed to change last minute.

https://www.youtube.com/watch?v=LAj-5ItvnDQ

Overview:

Overall, this group project has been interesting. I enjoyed working with the girls from VR and feel I have learnt new things regarding VR software and how best to import and export files for it. Communication has been fine during the process; Lauren took the role of lead and was very easy to contact, which meant creating good models and animations was possible.

Despite this, we also had many issues during the process. Firstly, one of the group members left mid-project, which meant we needed to pick up the extra work. I had to animate the penguin as a consequence of this, and upon reflection I believe we were very ambitious about what we could achieve within the time-frame.

We also faced issues with texturing models in Maya and exporting to Unity, which we were not able to overcome before the deadline. Therefore the VR experience does not completely match the aesthetic we were initially after. I have learnt that compromise is necessary when working in a group, as well as good and clear communication.


NUKE

WEEK 2: Motion Vectors

This week we continued learning how to efficiently clean up a plate. We were shown various methods for doing this.

I needed to first remove the tracker marks on the face. I started by rotoscoping each dot, using a circle and ensuring it tracked accurately.

Next I used a Merge (stencil) with the roto and the footage, then a Blur node at the right level to minimise the dots; this got rid of them at first glance. With a Premult after it, I again merged (over) the roto and the plate.

Then I needed to re-blur the dots and use a Shuffle and Grain to soften the patch; I also used an edge blur on the alpha to achieve the same effect.
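I built this tree by hand in the node graph, but roughly the same structure could be created with Nuke's Python API; the Read and Roto names below are placeholders and the blur size is a guess.

```python
import nuke

plate = nuke.toNode('Read1')   # the plate with the tracker dots (placeholder name)
roto = nuke.toNode('Roto1')    # tracked circles over each marker (placeholder name)

# Stencil the markers out of the plate, blur to fill the holes, then premult.
stencil = nuke.nodes.Merge2(operation='stencil')
stencil.setInput(0, plate)   # B: plate
stencil.setInput(1, roto)    # A: marker mattes
blur = nuke.nodes.Blur(size=15)
blur.setInput(0, stencil)
prem = nuke.nodes.Premult()
prem.setInput(0, blur)

# Merge the blurred patch back over the original plate.
patch = nuke.nodes.Merge2(operation='over')
patch.setInput(0, plate)
patch.setInput(1, prem)
```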


WEEK 3 & 4: 3D tracking

We discussed how to distort and undistort the plate we are using in Nuke, which is important for our collaborative project with the Crypt scene. We were also introduced to 3D tracking and building in Nuke, using the ScanlineRender node specifically.

We used an example shot to practice 3D tracking. We were advised not to track anything with reflections, such as windows or water, so we would need to mask them out of the shot. We used the CameraTracker node, inputting the relevant camera information.

For this semester's project I used the CameraTracker node, inputting the details of the camera that was used and the number of tracks (features) we wanted. I tracked and then solved; we were told an accurate result usually has an error between 0.5 and 1.

I deleted the unsolved tracks, then recalculated, refining the accuracy of the solve.

I added an origin and ground plane with the points, which ensured the tracks were organised and correctly placed. Then I created cards for the walls and ground, and merged them over the scene.

Once completed I exported the trackers and camera to Maya.

In class we then discussed the use of PointCloud and ModelBuilder on our example plate.

In week 4 we looked at projections, first establishing the difference between using a textured card and projecting onto the shot. We were taught the difference between cameras whilst using a projection, and the method of using a Project3D node with a patch.

The use of a FrameHold node was also explained, firstly when used on a camera, then when used on a ScanlineRender.

WEEK 5 & 6:

We continued looking at projections and the problems that may arise, such as stretching, doubling and resolution issues. We also went through the various types of projection that you can do.

This week I also continued working on my Crypt shot: I tracked the front wall in Nuke using one of the projection methods we were taught.

I rotoscoped the opening in the wall on frame 0, as it had the whole area in shot; I then used a FrameHold on it and projected it onto a card within the scene using Project3D. When I did the roto, I needed to apply an Invert node so that the correct part of the shot became the alpha. I also added an edge blur to soften the edges, for a more realistic roto.
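A minimal sketch of that projection setup in Nuke Python is below; the Roto and Camera names are placeholders, and in newer Nuke releases the projection node may be created as Project3D2 rather than Project3D.

```python
import nuke

# Hold the frame where the whole opening is visible, for both the patch and the camera.
held_patch = nuke.nodes.FrameHold(first_frame=0)
held_patch.setInput(0, nuke.toNode('Roto1'))     # placeholder: the roto'd opening

held_cam = nuke.nodes.FrameHold(first_frame=0)
held_cam.setInput(0, nuke.toNode('Camera1'))     # placeholder: the solved tracking camera

# Project the held frame through the held camera onto a card in the 3D scene.
proj = nuke.nodes.Project3D()
proj.setInput(0, held_patch)
proj.setInput(1, held_cam)

card = nuke.nodes.Card()
card.setInput(0, proj)
```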

WEEK 8 & 9: Green Screen

We started learning how to manage working with green screens in Nuke. Firstly, however, we were shown how you can manipulate an image using the colour controls; specifically, we were shown how to use the Keyer and Colorspace nodes.

For the homework, I edited the background using a luminance key in the alpha, then blurred and channel-merged it over the plate, following a method we were taught in class. I graded this to create a pinkish glow, because the foreground is red-tinged and I wanted them to match.

I blurred the entire background slightly. Then I used a Keylight to get an alpha for our foreground image, denoised it and added an EdgeBlur.

Once complete, I merged the keyed foreground over the graded background.
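As a rough sketch, the background half of this setup (luminance key, blur, pink grade) could be written in Nuke Python like this; the node names and values are placeholders for what I set interactively.

```python
import nuke

bg = nuke.toNode('Read_background')                # placeholder background plate

key = nuke.nodes.Keyer(operation='luminance key')  # put a luma key into the alpha
key.setInput(0, bg)

glow = nuke.nodes.Blur(size=25)
glow.setInput(0, key)

grade = nuke.nodes.Grade()                         # pushed towards pink to match the red-tinged foreground
grade.setInput(0, glow)

comp = nuke.nodes.Merge2(operation='over')
comp.setInput(0, bg)
comp.setInput(1, grade)
```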

WEEK 10:

To create the final shot, I used my previous roto and merged it over my render of the stationary steam engine. I had already created realistic lighting in Maya and set up all the correct AOVs.

I followed our tutor's script to correctly grade and colour-correct my sequence. I needed to separate the AOV channels; I mainly used diffuse_direct, diffuse_indirect and specular_indirect, grading and colouring them individually to get accurate lighting. Most importantly, I wanted the black points of my render and the shot to match.
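The sketch below shows roughly how that AOV split and regrade could be built with Nuke's Python API; the Read name is a placeholder and the individual Grade values were set by eye in the real script.

```python
import nuke

render = nuke.toNode('Read_render')   # placeholder: the rendered EXR with the Arnold AOVs

graded = []
for aov in ['diffuse_direct', 'diffuse_indirect', 'specular_indirect']:
    shuf = nuke.nodes.Shuffle(label=aov)
    shuf.setInput(0, render)
    shuf['in'].setValue(aov)          # pull this AOV layer into rgba
    grade = nuke.nodes.Grade()        # per-layer grade to match the plate's black point
    grade.setInput(0, shuf)
    graded.append(grade)

# Rebuild the beauty by adding the graded layers back together with plus merges.
plus_a = nuke.nodes.Merge2(operation='plus')
plus_a.setInput(0, graded[0])
plus_a.setInput(1, graded[1])
plus_b = nuke.nodes.Merge2(operation='plus')
plus_b.setInput(0, plus_a)
plus_b.setInput(1, graded[2])
```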

When happy with the grading, I applied a Write node and rendered out the overall sequence.

I then merged the roto over my graded footage to create the final shot.

I managed to remove the markers on the floor using four Tracker nodes and rotoscoping them; I then blurred the roto until the markers were gone, premultiplied, and rendered out the final footage. I am having trouble with Nuke as it is not following the tracks correctly, and I have to adjust them on each individual frame, which is taking too long.

An issue that I have been having is that the track wobbles a lot. When the scene is in Maya it is fine; however, when I render it out and import it to Nuke the track shakes again.

https://youtu.be/q9ME-d5G6BE

I have tried many different ways to fix this but keep coming back to the same issue, and I don't have time to re-do the camera track from scratch. Ideally I would do that and start the process again, so I will try this after the submission. Overall, however, I am not happy with this, which is unfortunate, as I have completed the model with animation and textures and graded it to the scene.

I am attempting to fix this problem in Nuke. I started by tracking four points on the machine, then set the Tracker's transform to 'remove jitter'.

This didn't really work; the shot is still very unstable, and I can't understand why when the track works perfectly fine in Maya. I will need to speak to a tutor regarding this issue.

I have found out what the issue was: the frame rates of the footage and the machine animation were not the same. I will render out a new sequence to bring into my Nuke script, which should be stable.
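A quick check that would have caught this earlier is confirming the frame rate in both applications; a minimal sketch, run separately in each program:

```python
# In Maya: the current time unit implies the frame rate ('film' = 24 fps, 'pal' = 25 fps, etc.).
import maya.cmds as cmds
print(cmds.currentUnit(query=True, time=True))

# In Nuke: the project frame rate lives on the root settings.
# import nuke
# print(nuke.root()['fps'].value())
```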



MAYA

WEEK 1 & 2

We were informed that this term we would be building a stationary steam engine in Maya, to be composited into a scene. We will be working on modelling, lighting and texturing, alongside looking in more depth at rigging and hierarchy.

I started by compiling some reference images for my engine, using elements from different machines to inspire my own.

In this week's class we created a simple wheel and piston system, and we were taught how to correctly rig it. Firstly, we very quickly built a basic piston, with a piston sleeve and a wheel attached. We did this by using polygon shapes and manipulating them with various tools we had previously learnt.

We added locators at the two ends of the piston and, using the Aim Constraint tool, we were able to ensure the piston would always move in the right direction (towards the locator).
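A minimal sketch of that locator and aim-constraint setup in Maya Python; the object names and positions are placeholders for the simple piston we built in class.

```python
import maya.cmds as cmds

# Locators at the two ends of the piston (placeholder positions).
wheel_loc = cmds.spaceLocator(name='wheel_pivot_loc')[0]
piston_loc = cmds.spaceLocator(name='piston_end_loc')[0]
cmds.xform(wheel_loc, translation=(0, 2, 0), worldSpace=True)
cmds.xform(piston_loc, translation=(6, 2, 0), worldSpace=True)

# The connecting rod (placeholder name) follows the wheel-end locator and always
# aims at the piston-end locator, so it stays lined up however the wheel is keyed.
cmds.pointConstraint(wheel_loc, 'connecting_rod', maintainOffset=True)
cmds.aimConstraint(piston_loc, 'connecting_rod', aimVector=(1, 0, 0), maintainOffset=True)
```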

We were also briefly taught how to use MASH to create the illusion of a rotating belt on our machines.

The animation is not totally accurate as the machine is not moving in the same direction as the belt. This exercise was primarily to get familiar with the tools to rig and animate our own models, so this isn’t a major problem.

I also started this term by continuing some work on my previous Maya scene. I had the idea of creating an advert for the bread; this is more of a personal project now, as I just want to see if I can create a successful and complete narrative. I started drawing a storyboard for the idea, and I'll come back to it when I have more time.

WEEK 3

This week we continued to create our own steam engines. I drew a basic design based on elements of the machines I liked from reference photos. I chose these parts because I believe they will create a realistic and complex machine. Furthermore, the main aim of this task is to fully understand and create a successfully rigged and animated stationary steam engine, so I chose a design that involves a lot of moving parts.

I started modelling the wheel, which took some time as I had to figure out exactly how I would like it to look. I started with a cylinder polygon primitive, adding to the geometry to create a sufficient number of faces to manipulate and deleting the unnecessary ones to form the shape of the wheel, extruding and using edge loops as necessary to create the detail. I also used a pipe polygon primitive to create the outer rim of my wheel.
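As an illustration of the starting point (not my exact steps), a cylinder primitive with enough subdivisions to carve the spokes out of could be set up like this; the values and face indices are placeholders.

```python
import maya.cmds as cmds

# A cylinder with plenty of axis and cap subdivisions to delete spoke gaps from later.
wheel, _ = cmds.polyCylinder(name='wheel_geo', radius=4, height=0.6,
                             subdivisionsAxis=32, subdivisionsCaps=4)

# Example of extruding a band of faces to thicken the rim (placeholder face range).
cmds.polyExtrudeFacet(f'{wheel}.f[0:31]', localTranslateZ=0.2)
```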

This was the first attempt; I didn't like the thickness of the wheel's spokes, so I decided to recreate it using the same method.

I started on the steam tank, manipulating the faces of a cylinder to create the desired shape, extruding and using edge loops when needed. I added details such as bolts and panels to my machine as necessary.

To create the latch for the steam tank I originally constructed the right shape using a polygon cylinder, then used boolean difference to create the punctures. This was not an efficient method as it ruined the geometry.

Therefore, I decided to try a different method. I used a polygon cylinder to create the upper part of the latch, deleting the faces to create the hole. I then mirrored this and extruded them into each other. This method kept the geometry relatively intact, so I continued with it, though I want to ask my tutor about this in the next class.

I manipulated the geometry as needed to replicate my reference image.

WEEK 4

We continued modelling our stationary engines this week, our tutor helped me with cleaning up the geometry on the latch. He also showed me an efficient way of creating the organic shape between the two latch bolts. Using the extrude tool and manipulation of the vertices I was able to achieve the look I was after.

Next I wanted to clean up the UVs on what I had already modelled, ensuring they were accurate by using the camera-based tool and unfolding in the UV Editor.

I also wanted to create a ‘hammered’ effect on the steam tank. I achieved this by adding divisions then using the sculpt tools to organically create the right aesthetic. I wanted the bolts to look hammered in also.

I continued modelling the details of the machine, going back and forth with the reference images.


I needed to sketch out the details of my engine in order to model accurately and efficiently. It helps me to visualise what I am designing.

WEEK 5

I continued modelling my stationary steam engine.

I used the lattice tool to bend the back-plate onto the steam tank, which my tutor had shown me was the most efficient method.


I continued modelling the front mechanism. I wanted to attempt animating part of the model, to ensure that the mechanisms I had modelled worked correctly.

I used a YouTube video to see how the movement works. I also re-watched our tutorial on constraints and locators and, using the same principles, attempted to apply them to my own machine.

First I established the pivot points of my movement, then placed locators at each of these points. It was important to ensure the central pivot and the locators were snapped together, which I did using wireframe shading and two viewport perspectives.

Then I used both the Aim and Point constraints accordingly, and put each element in the correct hierarchy in order to achieve believable movement.

This whole process took a bit of time; it was difficult to fully understand which points needed movement, what needed to be constrained, and what type of constraint each part needed.

I have successfully created the animation; however, there are a few issues remaining with it. Some of the locators are jumping during the sequence, so I will speak to my tutor about how to fix these problems in our next class.


I now want to build the piston at the front, so I roughly drew out how it would look first.

I had a lot of issues when animating the front piston, because the aim and point constraints were not working efficiently on my model. After discussing it with my tutor, I cleaned up the geometry so it was easier to manipulate, then we tried the constraints again.

Upon reflection, we decided that the issue was with the model itself; I believe the proportions of my handles made it difficult for the parts to follow each other when animated.

With my tutor's help we managed to get the piston working well enough for the task, although we needed to manually offset certain parts of the piston and keyframe them in order to achieve this.

I continued modelling the details of my machine using the same techniques as before.

The animation and machine are finished; I just need to slightly animate the belt and add the textures.


I imported my tracked scene from Nuke into Maya, then imported my machine as a reference. I lined them up and added the image sequence.

Due to time constraints I decided to texture the model in Substance Painter. I watched a few tutorials and looked at reference photos to decide on my colour scheme and aesthetic; I knew I wanted to use copper as my primary metal.

I ensured the UVs were correct for Substance Painter and added the right textures, then I added a paint layer and drew on the wear and tear that I wanted, as well as discolouration and stains.

I wanted to use a similar colour scheme to the scene, so I used the image plane as a reference point.

Once done, I imported the texture maps onto my Maya model. At this point the specular is too high in my opinion, so I will play with the settings to get an accurate finish.

I updated my reference in the Maya scene so my model would show the textures. I started working on the lighting, using two area lights in the corners of the room: one was a cylinder, to create more of a soft look, while the other needed to create sharper edges on the shadows, so I used a disk and changed the spread value.

I also changed the grade of the light source to be slightly more green, and I need to play with the exposure and intensity of the lights, as it looks slightly too dark right now. My tutor advised me that the black points of my model and the scene need to match.
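A rough sketch of that two-light setup as Maya Python, assuming the Arnold (mtoa) plugin is loaded; the names and values are placeholders that I would still tweak by eye.

```python
import maya.cmds as cmds

def make_arnold_area_light(name):
    """Create an Arnold area light and return its shape node."""
    node = cmds.shadingNode('aiAreaLight', asLight=True, name=name)
    shapes = cmds.listRelatives(node, shapes=True)
    return shapes[0] if shapes else node  # shadingNode may return the transform or the shape

# Soft cylinder-shaped light in one corner of the room.
soft = make_arnold_area_light('cornerSoftLight')
cmds.setAttr(f'{soft}.aiTranslator', 'cylinder', type='string')
cmds.setAttr(f'{soft}.exposure', 6)

# Disk-shaped light with a narrower spread for sharper shadow edges.
sharp = make_arnold_area_light('cornerSharpLight')
cmds.setAttr(f'{sharp}.aiTranslator', 'disk', type='string')
cmds.setAttr(f'{sharp}.aiSpread', 0.3)
cmds.setAttr(f'{sharp}.exposure', 7)
```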

I am also working on re-texturing the front piston. I am planning to do this in Maya; my tutor showed me how to create a good worn texture for the metal using the Hypershade.

I used an image of hammered copper to add to my colour maps for the piston, painting on the discolouration and texture.

Refining the textures of each part of the machine was taking a long time; it is something I could work on continuously, as I feel you can constantly improve it. When I was happy enough with the look, I created and selected the AOVs we needed, then rendered out the sequence and imported it to Nuke, ready to be graded and colour-corrected to fit the scene.
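For reference, those AOVs can also be added through the mtoa Python API rather than the Render Settings UI; a minimal sketch, assuming Arnold is the active renderer.

```python
# Assumes the Arnold (mtoa) plugin is loaded; the AOV names match the layers I graded in Nuke.
import mtoa.aovs as aovs

interface = aovs.AOVInterface()
for name in ['diffuse_direct', 'diffuse_indirect', 'specular_indirect']:
    interface.addAOV(name)
```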

I have found an issue with my work. The camera track is stable in Maya, yet whenever I render out the sequence and import it to Nuke, the track becomes very shaky. I am unsure why this keeps happening; I rendered out my Maya sequence twice, even trying a more stable track, yet the problem persisted. I have no idea how to fix this and, due to time constraints, I will not be able to before submission. I plan to do this after.

https://youtu.be/q9ME-d5G6BE

You can see just how unstable the machine is in the shot. I have really struggled to rectify this, so I think ideally I need help from one of my tutors.

I have found out what the issue was: the frame rates of the footage and the machine animation were not the same. I will render out a new sequence to bring into my Nuke script, which should be stable.