Please take note! For mod developers working with Unreal Tournament 2003, this documentation is meant to be a starting point for your own explorations into UT2003, not a definitive guide. There will be differences between the documentation here and the product in your hands, and you may have to figure out quite a bit for yourself. Check out the Unreal Tournament 2003 page in the Unreal Powered area for links to community sites if you're having problems. UDN is a licensee support site, and cannot provide technical support or game-specific assistance to end users.
UnrealModelTutorial
Last updated by Tom Lin (DemiurgeStudios). Original author was Tom Lin (DemiurgeStudios).
- Creating Models for the Unreal Engine
- Goals
- Modelling
- Mapping/Texturing
- Bones/Enveloping
- Process
- Exporting from 3D-Studio
- Exporting the Reference Pose
- Exporting an Animation
- Batch-Processing
- Preparing your animations
- Exporting the animations
- Importing into UnrealEd
This document explains the sometimes hairy process of getting a fully realized 3D model into the Unreal Engine. Let's dive right in.
Before we even begin to think about modeling, texturing, or other "process" questions, it would be wise to consider exactly what the models are trying to achieve, what importance they will have in the game, and what the target platform will be. During the course of this document, I will detail the creation of two models from start to finish, male and female humans.
The goals for these models were as follows:
- The models will be the central figures in the game.
- Models must be realistically proportioned.
- The models will require full lip-synching ability.
- Fully articulated hands (sign-language capable).
- The models must be usable with in-game environments on modern PC and console hardware.
The goals will affect pretty much every step of the process later on. In my typical order, those steps are: modeling, mapping, texturing, bones/enveloping, animation.
Before beginning the modeling process, start with the standard questions. How many polygons do I have to work with? What style is my model to be in? Is my model going to be replicated in a scene? What parts of the model will receive the most attention? For the models I made, most of the answers are cut and dried. I had a budget of 4000-4500 polys per model, realistic style, central characters. Since I was making people, I decided to give the face and hands the bulk of the polygons, especially since a wide range of motions was specified in my goals. Chances are they would be the focus of the models more often than not.
I use 3D Studio 4.2 for modeling, so some of my comments will be directed towards 3DS users. The modeling process with regard to Unreal requirements is fairly standard. There are several things to keep in mind which may smooth the modeling process somewhat.
First of all, gaps in the model will not break the model. That is to say, if you have a hole in the mesh surface, it should still work fine. In the same vein, having polys cross/clip into each other will also work just fine. There may be minor artifacting/shifting where they cross, however, so cross triangles sparingly. I mention this specifically because crossing polys appear in the eyes of these models.
On looking at the eye, one can see that the iris portion of the eye is on a roughly square sheet of polys. I have transparency information in the texture's alpha channel that will make the iris appear circular when viewed in Unreal.
The only time clipping triangles are problematic is when the materials assigned to both triangles use the alpha channel of the texture. This is not a small-detail problem; it is very apparent that something is broken.
The eye works fine because only the iris has alpha information. That which it clips into (the surrounding eye socket) uses no alpha channel, so there is no fighting for draw order. Therefore, early in the planning phase for your model, identify where you will be using alpha and make sure that two alpha triangles do not cross. If there is a chance of crossing, as there is in the girl model's hair, then try to minimize the angle at which the triangles meet: perpendicular is the worst case, with the severity dropping as the faces approach parallel.
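The perpendicular-is-worst rule can be made concrete: the severity of a crossing tracks the angle between the two triangles' planes, which you can compute from their normals. A minimal sketch (the helper names here are mine, not part of any Unreal or 3D Studio tool):

```python
import math

def triangle_normal(a, b, c):
    """Unit normal of a triangle given three (x, y, z) vertices."""
    u = [b[i] - a[i] for i in range(3)]
    v = [c[i] - a[i] for i in range(3)]
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    length = math.sqrt(sum(x * x for x in n))
    return [x / length for x in n]

def crossing_angle_degrees(tri_a, tri_b):
    """Angle between the planes of two triangles, folded to 0..90.
    90 is the worst case for crossing alpha faces; 0 means parallel."""
    na, nb = triangle_normal(*tri_a), triangle_normal(*tri_b)
    dot = max(-1.0, min(1.0, sum(x * y for x, y in zip(na, nb))))
    return math.degrees(math.acos(abs(dot)))
```

A pair of strands meeting at 80-90 degrees is where the draw-order fighting is most visible; if the number comes out near zero, the crossing is far less likely to show.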
One more thing to watch out for when modeling in 3D Studio is that 3ds loves to give you models in quads. There isn't a big problem with quads, but for a greater degree of control and clarity, I always use the modifier -Turn to Poly- in the modifiers list, check Limit Polygon Size, and set the max size to three.
Mapping and texturing is the next logical step after modeling. First of all, make sure that your texture method is guaranteed to work with your existing art pipeline. I use a third party UV unwrapping program to map models, although many people find that 3D Studio's -Unwrap UVW- modifier is enough.
Unreal requires that textures be 32-bit targas, whether or not you need alpha channel information in the texture. Take care that while making your texture packages, you use the correct bit depths.
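The bit depth is easy to verify in a script before you build packages: the standard 18-byte TGA header stores bits-per-pixel at byte offset 16. This check is an illustrative sketch, not part of any Epic tool:

```python
def tga_bit_depth(header_bytes):
    """Return bits-per-pixel from a TGA file's header.
    The 18-byte TGA header stores pixel depth at byte offset 16."""
    if len(header_bytes) < 18:
        raise ValueError("need at least the 18-byte TGA header")
    return header_bytes[16]
```

Read the first 18 bytes of each .tga in your texture folder and confirm the result is 32 before importing.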
The only other major concern while texturing for UnrealEd is the aforementioned "no crossing alpha textured triangles" rule. For a more detailed explanation, see above in the modeling section.
As a general texturing note: I made the textures for the male and female much larger than I anticipate them being used at - 1024x1024. Scaling down is easy; scaling up is bad, to say the least. I have my textures spread across separate maps, two completely opaque and one for all the bits that have alpha channel information. This division helps keep the alpha crossing from happening inadvertently, and also makes life a bit easier for the person who has to mess with the alpha settings in UnrealEd. If your model will have any alpha information at all, a minimum of two separate maps is recommended.
In the above image (two separate texture maps) you can see that on the alpha texture map there is hair, teeth, and eyes. The other texture map contains solid parts of the model, such as the face and arms, as well as the base layer of hair.
The first question when beginning the boning process is whether to construct your own skeleton or use Character Studio. In the case of the male and female models, we chose to use Character Studio, with our own additional bones attached to that ready-made structure. This tactic works fine in UnrealEd, so feel free to make models with extra appendages, super long hair, or "Dead or Alive" cleavage.
One of the first issues with the bone process that may come up is the concept of blending animations. For example, the female model I created has a large set of bones that control her hair. So, I could conceivably make a standalone hair animation that would be appropriate for both running and horseback riding. Once this is done, I can export the hair animation, and programmers can import this information into both running and riding animations. This has the potential to cut down on animation workload a great deal, but this is currently handled from the programming side of things. The important thing for animators to keep in mind is that this blending process happens to all the bones that attach from a specified bone, on down through the skeletal tree to the end. For example: the blending could be told to start from the middle of the hair bone, and then only the end-hair animation information would be shared.
For this reason, it's a good idea to have one root bone that comes off of the main skeleton, which controls all the other bones lower in the hierarchy. This allows ALL the hair to be blended at once, since you have a single bone parent. This is much cleaner than having each string of bones (strands of hair, fingers) connect to a part of the skeleton that you don't wish to receive blending information (hand, head).
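The "blending affects everything below the chosen bone" idea is just a subtree walk over the skeleton hierarchy. This conceptual sketch (`parent_of` and `blend_set` are illustrative names, not engine API) shows why one hair root bone gives you the whole hair in a single blend:

```python
def blend_set(parent_of, root):
    """Collect a bone and every bone below it in the hierarchy.
    parent_of maps each bone name to its parent's name (None for the
    skeleton root); blending from `root` affects exactly this set."""
    bones = {root}
    changed = True
    while changed:
        changed = False
        for bone, parent in parent_of.items():
            if parent in bones and bone not in bones:
                bones.add(bone)
                changed = True
    return bones
```

With a single HairRoot parenting every strand, `blend_set(..., "HairRoot")` covers all the hair at once; with strands attached directly to the head, the head itself would have to be the blend root.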
While creating models for UnrealEd, remember that the animation and model data are stored independently of each other, in .PSA and .PSK files. This is somewhat unique to Unreal, and deserves some explanation. ActorX is the plugin that generates the filetypes, see below for a more comprehensive explanation of ActorX. The reference pose, which will go into the .PSK, has only the model and bone information, none of the animation information. This is where you set the influence of the bones that will affect the assorted vertices in your model, whether by envelopes or locked verts. Conversely, the animations to be stored in .PSA files hold only information relating to the movement of the bone structure, and none of the weighting information.
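For the curious, both file types share the same simple chunked layout: a sequence of 32-byte headers (a 20-byte chunk id such as PNTS0000 or REFSKELT, a type flag, a per-record size, and a record count), each followed by its payload. This parsing sketch is my own illustration of that layout, not an Epic tool:

```python
import struct

def read_chunk_headers(data):
    """Walk the chunk headers of .PSK/.PSA file contents (as bytes).
    Each chunk is a 32-byte header -- 20-byte id, type flag, per-record
    size, record count -- followed by size * count bytes of payload."""
    chunks, offset = [], 0
    while offset + 32 <= len(data):
        chunk_id, _flag, size, count = struct.unpack_from("<20siii", data, offset)
        chunks.append((chunk_id.rstrip(b"\0").decode("ascii"), size, count))
        offset += 32 + size * count  # skip past this chunk's payload
    return chunks
```

Dumping the chunk names of a .PSK versus a .PSA makes the division concrete: the mesh file carries points, faces, the reference skeleton, and raw weights, while the animation file carries only bone names and key data.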
Sharing animations is another issue that should be kept in mind. If you have two models that are roughly the same, like I did (man, woman), it may be possible to share animation information between the two. However, this will only work if the two models share identical bone names for corresponding bones. For example, in the male and female models, we added in eyes, a tongue, and lips; these all might potentially have overlap in animation between the two, cutting the amount of work that's needed, so the bones were given the same names. This means you can take a .PSA animation file that is generated for one of the models and drop it into the .PSK reference pose for the other. Of course, for best results it's probably a good idea to tweak all the animations anyways for separate models, but riggers should keep in mind that animators may want to re-use animation in new models.
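A pre-flight check for animation sharing is simply comparing the two rigs' bone-name lists. This is an illustrative sketch, not an ActorX feature:

```python
def animation_compatible(bones_a, bones_b):
    """Two rigs can share .PSA animation data only if corresponding
    bones carry identical names. Returns (ok, only_in_a, only_in_b)
    so mismatched names can be reported back to the rigger."""
    only_a = sorted(set(bones_a) - set(bones_b))
    only_b = sorted(set(bones_b) - set(bones_a))
    return (not only_a and not only_b), only_a, only_b
```

Run this against the exported bone lists of the male and female rigs before animating; a stray "LipUpper" versus "lip_upper" is much cheaper to fix before the animations exist.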
When the bones are all in place, it's time to initialize your model. I'll just point you towards the excellent SkeletalSetup guide to model preparation.
I will take this chance to reiterate the importance of rigid vertex-link assignments and blending of 3 links. This is important because it's easy to miss these options and your model and animation will still behave nicely - that is, until you get to UnrealEd.
The remainder of the rigging is fairly standard. Envelopes and locked vertices both work fine. There is one potential problem, though it's pretty rare. If a locked vertex is accidentally set to no influence from any bone, then when imported into Unreal vertices will jump about haphazardly, and the model will tear in obvious ways. If you check the logs during the model import process into UnrealEd, it will spit out a null vertex error. Unfortunately, there is no easy way to track down which vert is broken. Another problem with locked vertices: once you start locking down vertex weights, it's vital that your model not change overmuch, as this may affect the vertex numbering. So, adding in polys is sort of bad, and removing polys is extra bad.
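Since UnrealEd won't tell you which vert triggered the null vertex error, it can be worth catching zero-weight vertices before export if you can get at the weight table. A conceptual sketch (the data layout here is a simplified stand-in, not the literal .PSK encoding):

```python
def find_unweighted_vertices(num_vertices, weights):
    """weights: (vertex_index, bone_index, weight) entries, in the
    spirit of a .PSK raw-weights table. Returns vertex indices with
    zero total influence -- the ones that jump about in UnrealEd."""
    total = [0.0] * num_vertices
    for vert, _bone, w in weights:
        total[vert] += w
    return [i for i, t in enumerate(total) if t <= 0.0]
```

An empty result means every vertex is driven by at least one bone; any indices returned are the verts to hunt down and re-lock before export.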
After your animations are done, all that remains is to make your .PSK and .PSA files. If for whatever reason you need to change the envelopes on your model (clipping, pinching, etc) you can just change the file that will be used to export the .PSK, the reference pose. The changes made to the .PSK will appear globally over all animations played on it within UnrealEd. Conversely, this means that if you make bad weight changes, all the animations will receive them.
To clarify the process for beginners, take a glance at the flowcharts below. The first might be thought of as the more straightforward, traditional method. Each step happens in order, and only moves onto the next when the previous is done. For an individual working on all parts of model creation, this makes a lot of sense.
However, if you have multiple artists, then it's inefficient to have people waiting for each other to finish up steps. In this method, the modeling naturally must be finished first. From there, the work can fork slightly, which allows animation to proceed at the same time as the texturing, which can speed the process up a bit. It's important to note that locking vertices should not take place until after the texturing is finished; frequently, texturing a model reveals inconsistencies or poorly constructed areas on a model. These should be sent back to the modeler for fixes, which can be incorporated easily into an animator's existing skeleton, as long as the animator has restricted himself to working with envelopes.
Please note: this second flowchart may be a touch misleading. The second set of unwrapping/texturing boxes is not mandatory; they simply indicate that changes to the mapping or texture can still take place after the locking of vertices. So, a possible schedule for a texture artist might go something like this:
- Receive model and begin unwrapping process.
- Unwrap model and rough texture.
- Find model problems and send model back for changes.
- Lock vertices, so that animator can work on final rigging.
- Work on a more refined unwrapping/texture.
If you are splitting your modeling/texturing/animation process among two or more artists, make sure that your pipeline works completely before beginning work on the models. For example, texturing in Maya is a poor choice if modeling/animation is to be done in 3DS Max, since Maya will re-number vertices on export of .OBJ files. This alone will destroy animation and texture work, if handled in the wrong order in a pipeline of artists.
The next-to-last stage of getting a model into UnrealEd is to generate the files which are imported into the engine. These are stored in a proprietary format generated by the ActorX plugin for Max or Maya.
The first thing that needs to be created is the .PSK file. PSKs contain the geometry, texturing, skeleton and vertex-weighting information. Generally characters are placed in a reference pose as described earlier in this document.
- Open ActorX in 3D-Studio. To do this select the utilities tab and click the More button. Select ActorX from the list and click okay.
- Click the Browse button to select a folder where you would like to save all your ActorX-generated files. This will fill in the output folder field with the path you select.
- Pick a name for your mesh and type it into the Mesh File Name field.
- Open the Actor X - Setup section of the tool.
- Check only Persistent Settings, Persistent Paths and All Physique Meshes.
- Back in the main section, click the Save mesh/refpose button. This will create a .PSK file in the specified directory. You'll eventually import this file into UnrealEd.
In order to export animations you will need to have a model with the same skeletal setup. To export a single animation:
- Open your animation in 3d-studio.
- Fill in the output folder as explained in the above section.
- Fill in the Animation File Name field. This will become the name of the .PSA file generated by ActorX.
- .PSA files can have many animations in them. Each animation is distinguished by the value filled in to animation sequence name. Fill in this value now.
- Define the frames of the animation you wish to export. You can do this by entering the frames into the Animation Range field as the number of the first frame, a hyphen, and the number of the last frame; for example, if you want to export frames 5 through 32 you would fill in "5-32". Alternatively, you can click the Time Configuration button in 3D-Studio and fill in the values for Start Time and End Time there. If you fill in these values, leave Animation Range blank.
- In the Actor X - Setup section of the tool, check cull unused dummies.
- Click Digest Animation.
- Click Animation Manager which will open the following Animation Manager dialog:
The left list is all the animations you have digested thus far. The right column is the list of animations that exist in the .PSA file you selected in the Animation File Name field. To refresh the right column, click Load.
- Select the animations you wish to export from the left column and click the Copy ==> button.
- Click Save. The .PSA file will be created.
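The "first-last" Animation Range format described above is simple enough to validate with a few lines. A sketch (my own helper, not part of ActorX):

```python
def parse_animation_range(text):
    """Parse ActorX's 'first-last' Animation Range string, e.g. '5-32',
    returning the (first, last) frame numbers as integers."""
    first, last = (int(part) for part in text.split("-", 1))
    if first > last:
        raise ValueError("start frame comes after end frame")
    return first, last
```

Running your range strings through a check like this before digesting catches typos such as reversed frame numbers.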
As the list of animations for each character grows the process for creating the .PSA files starts to take a very long time. In order to simplify this process you can have ActorX process all of the animations in a given folder in one step.
Before batch-processing you will need to format your animations.
- They will need to be in .max file format
- All of the animations that are to be processed together need to reside in a single directory
- Each file must have its start and end time set properly. Click the Time Configuration button in 3D-Studio to set this up properly.
- The files should have the same name as the desired animation name once in the engine.
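Before kicking off a batch run, it can help to preview what the folder will produce. This illustrative helper (not an ActorX feature) mirrors the rule above that each .max file name becomes an animation sequence name:

```python
def batch_animation_names(filenames):
    """Given the file names in a batch folder, list the .max files a
    Process all Animations pass would pick up; each name (minus the
    extension) becomes the animation's sequence name in the engine."""
    return sorted(name[:-4] for name in filenames
                  if name.lower().endswith(".max"))
```

Feed it the folder's directory listing and you get the exact sequence names that will land in the .PSA, so naming mistakes surface before the (long) batch run.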
Once your animations are properly setup follow these steps to export them all to a .PSA file.
- Fill in the Output Folder field in ActorX with the desired location of the .PSA file
- Fill in the Animation File Name field with the desired name of the .PSA file.
- In the Actor X - Setup section of the tool, check cull unused dummies.
- Also in the Actor X - Setup section click Process all Animations. ActorX will prompt you for the folder where all of the animations reside. Select the folder and click Okay.
- Now that the animations have been imported click Animation Manager which will open this dialog:
As before, the left list is all the animations you have digested thus far, and the right column is the list of animations that exist in the .PSA file you selected in the Animation File Name field. To refresh the right column, click Load.
- Select the animations you wish to export from the left column and click the Copy ==> button.
- Click Save. The .PSA file will be created.
When the models are made properly as described in this document, this process is quick and painless. If your models come out garbled or missing materials, go back and make certain you've done all the setup properly.
First, import the model into UnrealEd. To do this, open the animation browser and choose File->Mesh Import from the menu. Select your .PSK file and click Okay. Fill in the Package, Group, and Name values you would like and click Okay once again. If all goes well, the model should appear un-textured in the viewport, probably looking something like this:
To apply your textures to the model, first import the textures into a new texture package. For textures with special rendering requirements, like two-sided, alpha, or masking, you will need to create materials. If you need help with this, see MaterialTutorial. With your textures imported, assign them one by one to the list of materials in the animation browser by selecting a texture in the texture browser and then clicking Use in the Skin section. It will take a little bit of guesswork to figure out which material goes where, but after some messing around your model should look something like this:
Notice the animation browser distorts models somewhat. Your models will not have this distortion in-game.
Finally, you can import the animations from your .PSA file and assign them to the newly imported mesh. Go to File->Import Animation in the animation browser. Select your .PSA file and click Okay. Enter the package where your model resides into the Package field and click Okay once more.
Select your model and animation packages from the pull-downs at the top of the browser and select Edit->Linkup Anim and Mesh from the animation browser main menu. Pressing play at the bottom of the browser will start the animation playing.
The last step is to save your animation package. Do this by going File->Save in the animation browser.