UNIVERSITATEA “ALEXANDRU IOAN CUZA” DIN IAȘI FACULTATEA DE INFORMATICĂ
LUCRARE DE LICENȚĂ Functional Map Creation propusă de Ilau Marius-Constantin Sesiunea: Iulie, 2018 Coordonator științific Lect. Dr. Arusoaie Andrei
UNIVERSITATEA “ALEXANDRU IOAN CUZA” DIN IAȘI FACULTATEA DE INFORMATICĂ Functional Map Creation Ilau Marius-Constantin Sesiunea: Iulie, 2018
Coordonator științific Lect. Dr. Arusoaie Andrei
Avizat,
Îndrumător Lucrare de Licență
Titlul, Numele și prenumele ________________________________
Data ____________ Semnătura ________________

DECLARAȚIE
privind originalitatea conținutului lucrării de licență

Subsemnatul(a) …………………………………………………………………………… cu domiciliul în ………………………………………………………………………………………… născut(ă) la data de ………………..…., identificat prin CNP ………….……………..…………, absolvent(ă) al(a) Universității „Alexandru Ioan Cuza” din Iași, Facultatea de …………………… specializarea …………………………………………………………, promoția ……………………, declar pe propria răspundere, cunoscând consecințele falsului în declarații în sensul art. 326 din Noul Cod Penal și dispozițiile Legii Educației Naționale nr. 1/2011 art. 143 al. 4 și 5 referitoare la plagiat, că lucrarea de licență cu titlul: ___________________________________________________________________________________________________________________________________________________________________________________________________________________________ elaborată sub îndrumarea dl. / d-na ________________________________________________________, pe care urmează să o susțin în fața comisiei, este originală, îmi aparține și îmi asum conținutul său în întregime. De asemenea, declar că sunt de acord ca lucrarea mea de licență să fie verificată prin orice modalitate legală pentru confirmarea originalității, consimțind inclusiv la introducerea conținutului său într-o bază de date în acest scop. Am luat la cunoștință de faptul că este interzisă comercializarea de lucrări științifice în vederea facilitării falsificării de către cumpărător a calității de autor al unei lucrări de licență, de diplomă sau de disertație și, în acest sens, declar pe propria răspundere că lucrarea de față nu a fost copiată, ci reprezintă rodul cercetării pe care am întreprins-o.

Dată azi, ………………………… Semnătură student …………………………
DECLARAȚIE DE CONSIMȚĂMÂNT Prin prezenta declar că sunt de acord ca Lucrarea de licență cu titlul „Functional Map Creation”, codul sursă al programelor și celelalte conținuturi (grafice, multimedia, date de test etc.) care însoțesc această lucrare să fie utilizate în cadrul Facultății de Informatică. De asemenea, sunt de acord ca Facultatea de Informatică de la Universitatea „Alexandru Ioan Cuza” din Iași, să utilizeze, modifice, reproducă și să distribuie în scopuri necomerciale programele-calculator, format executabil și sursă, realizate de mine în cadrul prezentei lucrări de licență. Iași, 25.06.2018
Absolvent: [anonimizat] ________________________________ (semnătura în original)
Contents

Chapter 1. Introduction
    1.a. Context
    1.b. Motivation
    1.c. Contributions
    1.d. Organisation
Chapter 2. The Map Creation Problem
    2.a. Problem Description
    2.b. Existent Approaches
        2.b.1. The Search-Based Approach
        2.b.2. Interpolated Random Terrain Approach
        2.b.3. Gradient-based Random Terrain Approach
Chapter 3. Functional Map Creation using 2D Gradient Noise
    3.a. General Contributions
    3.b. Solution Components
        3.b.1. The Perlin Noise Algorithm
            3.b.1.a. Grid Definition
            3.b.1.b. The Dot Product
            3.b.1.c. The Interpolation Function
        3.b.2. The Noise Matrix
        3.b.3. The Colour Matrix
        3.b.4. Unity Editor Customisation
        3.b.5. Mesh Creation
        3.b.6. Parallelisation
        3.b.7. The Merging Problem
        3.b.8. Dynamic Adaptive Mesh
Chapter 4. Solution Evaluation
    4.a. Visual Experiments
    4.b. The Speed Experiment
Chapter 5. Conclusion
    5.a. Summary
    5.b. Improvement Plans
References
1. Introduction:

1.a. Context:

Functional content creation can be generally defined as a method of algorithmically creating game content with limited or indirect user input, as opposed to the classical, manual approach. The latter is much more time-consuming and is usually performed by 3D modellers and game designers. A more straightforward definition could be: an implementation of an algorithm that automatically generates a game world. However, it is of utmost importance to understand that this procedure does not completely remove modellers or designers from the game development process. Rather, it improves the productivity of the entire team as a whole, in the same way in which an automation script helps a system administrator perform different tasks: it removes unnecessary repetition. In spite of that, a major drawback is the periodicity of the created content, an effect that must be reduced, if not completely removed, because it can affect the overall game atmosphere. The term content refers to the majority of elements encountered in a game, for example: entire levels, the game map, rules, textures, items, quests, storylines, characters, etc., but not the game engine, the behaviour of non-player characters or any artificial intelligence, because these are separate research domains, as stated in [1]. Nevertheless, this paper will focus on game map building, not because the other content components would not prove a worthy challenge, but because of a personal attraction to geography and geometry. The techniques involved have been inspired by, and could be used in, other areas such as physics and signal processing. Below are two classifications of the types of content that can be produced and found in games (retrieved from [7] and [1]):

Classification 1 — by the nature of the game element created [7]:

a. Game Bits: elementary units of game content that do not affect the player's gameplay when considered independently. Included in this category: textures, sounds, vegetation, structures, behaviours, fire, water, stone, or clouds.
b. Game Space: the environment in which the game is played. It consists of different Game Bits units. Included in this category: indoor maps, outdoor maps, water bodies such as seas, lakes, or rivers.

c. Game System: for example, ecosystems, road networks, urban planning.

d. Game Scenarios: flows of occurring events, such as puzzles, storyboards, backstory, and the concept of levels.

e. Game Design: rules and objectives.

Using this classification, the objective of this paper is to study the challenges of producing game space content, and more specifically outdoor maps.

Classification 2 — a revision of the above [1]:

a. Online – Offline Generation: distinguishes between generating the content continuously during the game runtime and generating it only during the development stage of the game. The example given is one where the player enters a room and the interior/objects/enemies are created on the spot, against one where the room has already been created and refined by a game designer before the game is released. The online version clearly has to comply with the game rules directly, whereas the offline one is verified and modified through human interaction.

b. Necessary – Optional Content: the second distinction refers to the importance of the content: whether it is necessary or optional for the particular game. Necessary content is required by players to progress through the game. For instance, dungeons that have to be crossed, monsters that must be defeated, crucial rules of the game, etc., fall into this category. Optional content is everything that the player can choose to ignore, such as weapons or facilities that can be discarded. Another difference between these two variants is the required quality: necessary content must always be of excellent quality and functionally correct. It is not acceptable to create an unpassable dungeon, incorrect rules or unbeatable monsters if these anomalies make it impossible to continue the game.
It is also not acceptable to generate content whose difficulty is unfair compared to the rest of the game. On the other hand, for optional content, it is in principle allowed that an algorithm also produces unusable weapons or an unreasonable dungeon layout, as long as the player can choose to discard the weapon or leave the dungeon. Clearly, the importance given to a piece of content depends on the design of the game and its storyline. For example, the first-person shooter Borderlands has a random weapon creation mechanism, and many of the generated weapons are not useful,
but analysing these weapons is at the heart of the gameplay. On the other side, a structure with a simple, low-quality level of detail seems artificial and cannot make a credible setting in a game whose main objective is visual realism, such as Call of Duty 4: Modern Warfare. It is interesting to note that some content may be optional in one category of games and necessary in others (for example, dungeons). Thus, the analysis of what is “optional” in a game needs to be made on a case-by-case basis.

c. Control Degrees: this distinction depends on the type of building algorithm that is used and how it can be parameterised. At one extreme, the algorithm can simply use a randomly generated number as input; at the other extreme, it can take a multi-dimensional vector containing the parameters that specify the properties of the content to be generated.

d. Deterministic or Stochastic: this distinction relates to the degree of randomness in the build process. Deterministic algorithms generate the same content given the same input parameters, whereas stochastic ones can produce different content even with identical input parameters.

e. Constructive or Generative-with-test: this last distinction refers to the output of the algorithm. Constructive algorithms generate the content and end their execution, producing the output result. However, some form of control is necessary to avoid the production of unsuitable content. This can be achieved through operations, or sequences of operations, which ensure that non-usable material cannot be produced. A generative-with-test algorithm incorporates both a creation mechanism and a test: during the execution of the algorithm, each content instance is tested according to some criteria (which depend on the design of the game).
If the candidate fails the test, it is fully or partially discarded and regenerated, and the process continues until the content has sufficient quality. As an overview, this paper aims to produce a functional game space (an outdoor map) that can be used in either an offline or an online procedure, targeting necessary game content, with a medium control degree, deterministic with respect to input parameters, and constructive. Another important aspect is where this procedure could be used, and the answer is: in games. However, it is again important to understand that it is not limited to games, but applies to any type of application that simulates an environment: a flight or driving simulator, a war preparation exercise or a training program for astronauts attempting a Mars landing. Because these real-life applications require a much more constrained type of environment, for obvious reasons, we will keep our attention focused on the game development domain. The area of personal training for specific activities, where incorrect use of the software could put a person's life in danger in a real-life situation, is not the purpose of this paper, and this should be taken into consideration if such an implementation is required. The most important feature of the generated content, in our view, is that it should be playable and should provide an immersive experience for the player.

Desirable properties of the solution [1]:

a. Speed: the requirements for speed can vary from milliseconds to months, depending on the size of the asset generated and whether the procedure happens online or offline.

b. Reliability: this property is related to the fact that the output of the algorithm can negatively affect the general game flow. A reliable solution must not send the game into an unplayable state.

c. Controllability: the property describing the solution's degree of control.

d. Expressivity and diversity: there is often a need to generate a diverse set of content, to avoid it all looking like minor variations on a tired theme. At one extreme of non-expressivity, consider a level “generator” that always outputs the same level but randomly changes the colour of a single stone in the middle of it; at the other extreme, consider a level generator that assembles components completely randomly, yielding senseless and unplayable levels. Measuring expressivity is a non-trivial topic in its own right, and designing level generators that produce diverse content without compromising on quality is even less trivial.

e. Creativity and believability: a property related to the periodicity of the generated content. A solution's output should look as if it was produced by a human designer rather than an algorithm, because this affects the general atmosphere and the sensation the game environment induces in the player.
Defining a game has proven to be a very complex task, and a complete perspective can be found in [2]. This paper will focus on games as pieces of software produced as a means of entertainment for their players, and more precisely on a specific part of the game environment, the game map: “A dimension which collaborates game rules, objectives, subject, and theoretical aspects together as a whole to provide an interactive flow of activity.”, as stated in [3]. The game map is the total space available to the player during the course of completing a discrete objective. In games with linear progression, it usually represents a location in a larger world and can contain characters to be fought, movement challenges such as jumping or climbing puzzles, or any form of obstacle course [4]. However, the main focus of the paper will be on the creation of various surfaces and geographical elements such as mountains, valleys or oceans, both in shape and colour.

1.b. Motivation:

Three main motivations stood at the basis of this paper: a creative one, a financial and managerial one, and a personal one. Each of them will be further analysed in this section. The first, so-called creative one is related to the fact that humans, whether programmers, designers, artists or members of any other profession, even when they try to come up with creative solutions to various problems, usually end up mixing various existing examples into one they call their own. It is understood that this is the way creativity works, as a process of mixing existing elements, usually from different domains of activity, but this way of creating content tends to give birth to similar end results. For example, if we take two “AAA” games from last year, Call of Duty: WWII and Battlefield 1, apart from the storylines, the temporal period in which the games are placed and some mechanics, the gaming experience is, to some extent, pretty much the same. This is merely the opinion of the writer, and under no circumstance should it be considered an offence to the teams building these franchises. However, the main point is that a solution coming from an algorithmic perspective could be completely new and innovative, and could somehow reinvigorate the gaming scene.
An addition to this could be the possibility of creating game content at the same time as the player's in-game time, solving a long-standing problem: why do the best games we play have to end? MMOs have tackled this issue in a different manner, by creating islands of storyline content in a very repetitive world, but functional development could also mean the creation of storylines at runtime, a very appealing but still quite esoteric idea at the moment. The second motivation is a much more practical argument, related to the financial and managerial side. Lately, the development costs of large game franchises have increased more than expected. If we take a look at what it meant to build an “AAA” game 10 or 20 years ago, we
would see that today such a thing is simply no longer possible for a smaller company, because of the resources involved. However, functional content creation could be one of the solutions to this problem, reducing the number of people needed to develop such pieces of software and increasing both productivity and revenue. The revenue part follows because, if we do not take marketing campaigns into account, the largest share of development costs is usually due to artists and designers rather than programmers; the productivity part follows from the misunderstandings and disagreements between these two departments. A more detailed discussion of the subject can be found in [5]. To some extent, we could state that programmers have found a way, metaphorically speaking, to tip the scales back. Below, we attach two illustrative statistics regarding this matter, retrieved from [6].
Fig. 1 Fig. 2

The first one shows the costs of various game types based on the number of people who worked during the development phase and the size of the project (on consoles), while the second one focuses on the type of targeted device. It is also worth noting that the average development time for a current “AAA” game is somewhere around 1-2 years from the moment it has been announced, plus one or even more years before that. With these facts in hand, it is clear that new companies cannot consider joining the big market without a huge starting budget. However, functional creation algorithms could prove to be just the right solution for changing that. Nonetheless, they should not be thought of as silver bullets that solve all development problems; it will be shown that this approach has its own set of flaws and challenges. They are rather another technique in the developer's arsenal, to be used at the right moment and after a thorough analysis.
Last but not least, there is a personal motivation. This topic is one of the first advanced areas we came across, even before starting university studies. Moreover, it was related to video gaming, a subject not usually tackled much from an academic perspective, but clearly a domain of interest for us. This great opportunity was provided by a project named FII Practic, and it was one of the reasons why we took this path. Later on, during an Erasmus+ mobility, we further analysed this topic.

1.c. Contributions:

1. We implement a version of the Perlin Noise Algorithm;
2. We construct the noise and colour matrices using Unity-specific data structures;
3. We customise the Unity editor with parameters to improve the general usability;
4. We translate the 2D noise matrix into an actual 3D terrain mesh;
5. We add multithreading to build multiple parts of terrain at the same time around an object of interest;
6. We transform the static meshes into adaptive ones, with a level of detail that decreases with distance from the object of interest, to improve performance;
7. We solve the problem of matching different parts of the map built at the same time;
8. We generate different test cases for various parameter values and map sizes.

1.d. Organisation:

In Section 2, we define the problem of functional map creation and provide an overview of existing approaches, while motivating our own. Section 3 introduces the solution based on Perlin Noise, analyses it in depth and, at the same time, describes the difficulties encountered. In Section 4, various experiments are presented, showing the output in correlation with the variation of the input parameters, together with some performance comparisons between the single-threaded and the multi-threaded solution. Section 5 draws a conclusion by summing up all the main parts of the building process and providing a list of possible improvements and directions that could be
followed up in the future. Section 6 is the bibliography and Section 7 is the Appendix, where supplementary materials such as code examples and additional information can be found.

Chapter 2. The Map Creation Problem:

2.a. Problem Description:

The problem targeted by this paper is that of creating an outdoor, geographical map of a plausible game world by algorithmic means. The solution should be fast, playable, with a medium degree of freedom, and as realistic as possible, without the use of highly specialised 3D modelling approaches. This somewhat generic description can be summed up as follows: how can we create areas containing topographies that are as realistic as possible, with types of terrain of different altitudes? One might think that a trivial approach of lerping (linearly interpolating) the coefficients stored in a 2D matrix can be used, and this would not be extremely far from reality if we did not take into account the size and the believability of the result. With these in mind, it is clear that a proper solution requires parallelisation, but this raises another, even more complicated problem: how can we create multiple adjacent regions and make the types of landform match? Moreover, is it necessary for the whole map to be created at once, or can it be done only in an area around the player? Will the player notice if the areas far away from him have a lower level of detail than the ones closer to him?

2.b. Existent Approaches:

Given the fact that the studied problem is somewhat general and has applicability in many domains of activity, different approaches have been developed over time. Below, we briefly present them as they appeared in [1], to give a broader view of the research field:

2.b.1. The Search-Based Approach:

In search-based functional content creation, a genetic algorithm or some other stochastic search/optimisation algorithm is used to search for content with the desired qualities.
The basic explanation is that of design as a search process: a good enough solution to the design problem exists within some space of solutions, and if we keep iterating through many possible solutions and perform additional modifications at each step, keeping those
changes which make the solution(s) better and discarding those that are harmful, we will eventually arrive at the desired solution. The core parts of this approach are: an evolutionary search algorithm (the most popular is probably NSGA-II [8]), a representation of the content, and an evaluation function to check whether the intermediate solution is better than the previous one. An example of where this approach has been used is the maps in Blizzard Entertainment's StarCraft:
Fig. 3 Fig. 4
Fig. 5 Fig. 6

2.b.2. Interpolated Random Terrain Approach:

To understand why this method was developed, and why a plain random number generator is not enough, we have to think about the nature of Earth's geography. The altitude of a randomly chosen point on Earth is probabilistically correlated with the altitude of the points near it. In other words, the points near a mountain peak are likely to also have high altitudes, and the points in the middle of a field are likely to all have low altitudes.
A random number generator cannot produce this kind of gradual ascent and descent on its own; in the best case, it will produce a very spiky map that is not even close to anything in reality. This approach makes use of an interpolation function, for example a weighted average in two directions: if we know the heights of the two points [0,0] and [0,10] and we need to fill in the height of the point [0,1], we use the formula height[0,1] = 0.9 * height[0,0] + 0.1 * height[0,10], because we want the height curve to change gradually. The function is applied over multiple iterations to produce a smoother transition between points. The core of the method is the random number generation plus the interpolation function. This approach has been further researched and enhanced, and is more didactic than suited for production games. However, a good example can be found in Thekla, Inc.'s The Witness, though it is worth noting that this is a case of offline generation enhanced by designers afterwards:
Fig. 7

2.b.3. Gradient-based Random Terrain:

Whereas the interpolation method described above uses a random generator to produce height values and then interpolates the slopes, this technique first generates the slopes and then calculates the height values. We still use a random number generator for initialisation, but this time two gradient values, (dx, dy), are stored, standing for the slope steepness in the x and y directions. Both positive and negative values are used, for mountains and valleys respectively. To recover the height values from this state, there are different approaches that use reference points, grids, or some other sort of mark or multiple marks, and compute a dot product between a vector drawn from these marks and the gradient vector. Depending on the number of
marks chosen (usually 4, in 2D), 4 values are produced, which need to be interpolated to produce a single value, the height. This has a few advantages over producing height values directly, such as increased smoothness: rather than smoothing the change in heights with an interpolation method, we smooth the rate of change in heights, so slopes grow shallower or steeper smoothly. Also, since the different types of surfaces are not directly generated at actual points of the map, but emerge from the direction of slopes, the overall arrangement looks somewhat more organic. A good example, even though based on a proprietary algorithm in the same theoretically refined spirit, are the terrain maps from the game Minecraft (this game applies post-processing after the generation phase, but the focus here will be on the generation):
Fig. 8 Fig. 9

Other building methods have been tried and are under active research right now, such as genetic programming for terrain creation or machine learning techniques, but those have not been used in any AAA game to date, mostly because of controllability issues. However, a combination of machine learning and one of the methods described above could be the answer to the desired properties of game designers and could maybe lead to fully auto-generative games.
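The contrast between the two random-terrain approaches above can be illustrated with a minimal 1D sketch. This is not the thesis implementation and all names are ours: the first function interpolates directly between random anchor heights (section 2.b.2), while the second stores random slopes and recovers heights from slope-times-distance products blended by a smooth curve (section 2.b.3).

```python
import random

def fade(t):
    # Smoothstep-like curve so slopes change smoothly (Perlin uses a quintic).
    return t * t * (3 - 2 * t)

def interpolated_terrain(num_points, spacing, seed=1):
    # 2.b.2: random HEIGHTS at anchor points, then weighted averages between
    # them, e.g. height[1] = 0.9 * height[0] + 0.1 * height[10] for spacing 10.
    rng = random.Random(seed)
    anchors = [rng.uniform(0.0, 100.0) for _ in range(num_points)]
    heights = []
    for i in range(num_points - 1):
        for step in range(spacing):
            t = step / spacing
            heights.append((1 - t) * anchors[i] + t * anchors[i + 1])
    heights.append(anchors[-1])
    return heights

def gradient_terrain(num_points, spacing, seed=1):
    # 2.b.3: random SLOPES at anchor points; the height at a sample comes from
    # the dot products slope * distance to each neighbouring anchor,
    # interpolated with the fade curve.
    rng = random.Random(seed)
    slopes = [rng.uniform(-1.0, 1.0) for _ in range(num_points)]
    heights = []
    for i in range(num_points - 1):
        for step in range(spacing):
            t = step / spacing
            left = slopes[i] * t              # slope times distance to left anchor
            right = slopes[i + 1] * (t - 1)   # distance to right anchor is negative
            heights.append(left + fade(t) * (right - left))
    heights.append(0.0)  # gradient noise is exactly zero on the anchors
    return heights
```

Note how `gradient_terrain` is zero on the anchors themselves: the height information lives entirely in the slopes, which is also why full Perlin noise returns 0 at grid nodes.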
Chapter 3. Functional Map Creation using 2D Gradient Noise:

This paper takes a closer look at the 2D Improved Perlin Noise Algorithm and proposes an implementation with a different applicability than its original one, which was intended to create realistic textures. To demonstrate the usability of this approach, multiple Unity scenes and additional mechanisms were created for a simulation of world creation in a computer game. Because this proved problematic with respect to time for larger, higher-scale maps, multithreading and a dynamic level of detail for meshes were added. Also, to get as close as possible to how this process happens in a real online building context, the map was created dynamically during game time around an object of interest (in a “real” situation, this would be the player), and not statically.

3.a. General Contributions:

1. Personal implementation of the 2D Perlin Noise Algorithm;
2. Building of a Noise Matrix with multiple controllability parameters: width, height, seed, scale, number of octaves, persistence, lacunarity, offset vectors and a normalisation flag;
3. Creation of a colour matrix for a believability effect;
4. Customisation of Unity editor parameters to improve the general usability;
5. Mesh creation;
6. Multithreading for building multiple parts of the map at the same time around an object of interest;
7. Solving the problem of matching different parts of the map created at the same time;
8. Transformation of the static meshes into adaptive ones, with a level of detail that decreases with distance from the object of interest, to improve performance;
9. Different test cases for various parameter values and map sizes.
3.b. Solution Components:

3.b.1. The Perlin Noise Algorithm:

Perlin Noise is a type of gradient noise (a type of noise that represents slopes rather than direct values) developed by Ken Perlin in 1983 in an effort to improve the look of computer graphics at that time. It was initially built for the creation of textures, and received an Academy Award for Technical Achievement in 1997, stating that: “The development of Perlin Noise has allowed computer graphics artists to better represent the complexity of natural phenomena in visual effects for the motion picture industry.” [9] The algorithm is usually used as a 2-, 3- or sometimes 4-dimensional function, but there is no theoretical limitation on the number of dimensions. The implementation requires 3 steps:

1. Grid Definition (with pseudorandom gradient vectors)
2. Dot Product (between the distance and gradient vectors)
3. Interpolation

3.b.1.a. Grid Definition:

In this step, a 2-dimensional grid is defined where each point is assigned a pseudorandom 2-dimensional gradient vector. The assignment of gradients is trivial in the 2D case with the use of a pseudorandom number generator. To remove some of the computational operations needed when recomputing new gradients for each grid node, precomputed gradient vectors can be used. A random seed can also be used if multiple instances of noise are required, but this will be done in the noise matrix building part, and not in the basic algorithm. Below, we attach the code responsible for the initialisation of the gradient vectors. It is important to understand that this procedure will not be used to recalculate the gradients on each pass of the algorithm; rather, we calculate a table of them once, before the actual run, together with a hash table for referencing. This improves the overall time performance of the algorithm, and the space tradeoff for storing the table is negligible.
To initialise both the gradient vector table and the hash table, a static constructor, which runs once when the class is first used, is employed:
Static Initialisation Function:

static NoiseLibrary()
{
    CalculatePermutation(out permutationVector);
    CalculateGradients(out gradientVector);
}

Hash Table Calculation:

public static void CalculatePermutation(out int[] permutationVector)
{
    // 'out' makes the freshly built array visible to the caller; assigning to
    // a plain array parameter would only change the local reference.
    permutationVector = Enumerable.Range(0, 256).ToArray();
    for (int counter = 0; counter < permutationVector.Length; counter++)
    {
        var swapIndex = randomNrGenerator.Next(permutationVector.Length);
        var temp = permutationVector[counter];
        permutationVector[counter] = permutationVector[swapIndex];
        permutationVector[swapIndex] = temp;
    }
}

Gradient Vector Calculation:

static void CalculateGradients(out Vector2[] gradientVector)
{
    gradientVector = new Vector2[256];
    for (var i = 0; i < gradientVector.Length; i++)
    {
        Vector2 gradientValues;
        do
        {
            // Sample in [-1, 1] x [-1, 1] so that gradients point in all
            // directions, and reject points outside the unit disk or too
            // close to zero to be normalised safely.
            gradientValues = new Vector2(
                (float)(randomNrGenerator.NextDouble() * 2 - 1),
                (float)(randomNrGenerator.NextDouble() * 2 - 1));
        } while (gradientValues.sqrMagnitude >= 1 || gradientValues.sqrMagnitude < 1e-6f);
        gradientValues.Normalize();
        gradientVector[i] = gradientValues;
    }
}
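For intuition, here is a small Python sketch, with hypothetical names of our own rather than the thesis code, of how such a permutation table is typically used to hash a grid node to one of the 256 precomputed gradients; the double lookup yields a repeatable, seemingly random gradient per node:

```python
import math
import random

random.seed(0)
permutation_vector = list(range(256))
random.shuffle(permutation_vector)

# 256 precomputed unit gradient vectors, one per table slot.
gradient_vector = []
for _ in range(256):
    angle = random.uniform(0.0, 2.0 * math.pi)
    gradient_vector.append((math.cos(angle), math.sin(angle)))

def gradient_at(ix, iy):
    # Hash the integer grid node (ix, iy) through the permutation table;
    # '& 255' wraps coordinates so the table can serve an unbounded grid.
    index = permutation_vector[(permutation_vector[ix & 255] + iy) & 255]
    return gradient_vector[index]
```

Because the tables are fixed after initialisation, `gradient_at` always returns the same gradient for the same node, which is exactly what the noise function needs for coherent terrain.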
3.b.1.b. The Dot Product: If the noise function takes 2 parameters, the next step of the algorithm is to determine into which cell of the grid the given point falls. For each corner of that cell, the distance vector between the point and the corner is determined. Then, we compute the dot product between the gradient vector stored in the node and the distance vector. For a 2D point, 4 distance vectors and dot products will be calculated; in general, 2^n dot products are needed, leading to an O(2^n) complexity with respect to the number of dimensions used. This is the operation that dominates the overall complexity of the whole algorithm.

3.b.1.c. The Interpolation Function: The last step of the algorithm is to interpolate between the results obtained above. Because of this, the noise function returns 0 when evaluated on the actual grid nodes. The interpolation is done using a function whose first and second derivatives are both 0 at 0 and at 1. Again because of this, the gradient of the resulting noise function evaluated at the grid nodes is equal to the precomputed pseudorandom gradient vector stored there. Ken Perlin suggests the following interpolation function:

f(t) = 6t^5 - 15t^4 + 10t^3

This is the result of taking a generic 5th order polynomial, calculating its first and second derivatives, equating them to 0 at 0 and 1 and then solving the resulting system. Derivatives:

f(t)   = at^5 + bt^4 + ct^3 + dt^2 + et + g
f'(t)  = 5at^4 + 4bt^3 + 3ct^2 + 2dt + e
f''(t) = 20at^3 + 12bt^2 + 6ct + 2d

Applying the 0 and 1 values at the endpoints for the function and both derivatives:

f(0) = 0:   g = 0
f(1) = 1:   a + b + c + d + e + g = 1
f'(0) = 0:  e = 0
f'(1) = 0:  5a + 4b + 3c + 2d + e = 0
f''(0) = 0: 2d = 0
f''(1) = 0: 20a + 12b + 6c + 2d = 0
We solve this system by applying Gaussian elimination with back substitution, written in Python, and obtain the result below:
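A minimal sketch of such a solver is shown below (pure Python, with hypothetical function and variable names, not the exact script used for the thesis). The rows encode, in order, the conditions f(0) = 0, f(1) = 1, f'(0) = 0, f'(1) = 0, f''(0) = 0 and f''(1) = 0 for the coefficients (a, b, c, d, e, g):

```python
def gauss_solve(A, b):
    """Gaussian elimination with partial pivoting and back substitution."""
    n = len(A)
    # build the augmented matrix [A | b]
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        # partial pivoting: bring the largest entry of the column to the top
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    # back substitution
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# unknowns in the order [a, b, c, d, e, g]
A = [
    [0, 0, 0, 0, 0, 1],    # f(0)   = 0
    [1, 1, 1, 1, 1, 1],    # f(1)   = 1
    [0, 0, 0, 0, 1, 0],    # f'(0)  = 0
    [5, 4, 3, 2, 1, 0],    # f'(1)  = 0
    [0, 0, 0, 2, 0, 0],    # f''(0) = 0
    [20, 12, 6, 2, 0, 0],  # f''(1) = 0
]
rhs = [0, 1, 0, 0, 0, 0]
coeffs = gauss_solve(A, rhs)  # approximately [6, -15, 10, 0, 0, 0]
```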
Results: the coefficients obtained are a = 6, b = -15, c = 10 and d = e = g = 0, exactly the coefficients of Perlin's interpolation function, concluding our demonstration. In addition to this, we want to produce values in the interval [-1, 1], since we also need terrain areas below the "sea" level, so the final value will be clamped and multiplied by a scaling factor. Below, we include the code of the Interpolation function and of the PerlinNoise function, which uses elements from the functions described above. This procedure produces the final value that will be used to further create the noise matrix.

Interpolation Function:
static double Interpolate(float t)
{
t = Math.Abs(t);
if (t <= 0)
return 0;
else if (t >= 1)
return 1;
else
return 6 * Math.Pow(t, 5) - 15 * Math.Pow(t, 4) + 10 * Math.Pow(t, 3);
}
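As a quick sanity check on the interpolation function above, the polynomial and its analytic derivatives can be evaluated at the endpoints (a small Python sketch, independent of the C# implementation):

```python
def fade(t):
    # Perlin's quintic interpolation polynomial
    return 6 * t**5 - 15 * t**4 + 10 * t**3

def fade_d1(t):
    # first derivative: 30t^4 - 60t^3 + 30t^2
    return 30 * t**4 - 60 * t**3 + 30 * t**2

def fade_d2(t):
    # second derivative: 120t^3 - 180t^2 + 60t
    return 120 * t**3 - 180 * t**2 + 60 * t

# endpoint values and vanishing first and second derivatives, as required
assert fade(0) == 0 and fade(1) == 1
assert fade_d1(0) == 0 and fade_d1(1) == 0
assert fade_d2(0) == 0 and fade_d2(1) == 0
```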
As we have seen above, the complexity of the algorithm is O(2^n) with respect to the number of dimensions used, which in our case is 2. Even though the initial result might look scary, it is not this complexity that creates problems in implementations, but rather the huge number of times the function has to be evaluated to create any sort of map. Later, in our implementation, we will take this into account and try to parallelise this process as much as possible, while at the same time using it only in a neighbourhood of an object of interest. To conclude this section regarding the Perlin Noise algorithm, we chose this approach because of the improved degree of believability it produces with respect to terrain creation. A more detailed description and further research on the topic can be found in [10].

Perlin Noise Function:
public static float PerlinNoise(float x, float y)
{
Vector2 cellCoordinates = new Vector2((float)Math.Floor(x), (float)Math.Floor(y));
double finalValue = 0f;
Vector2[] cornerCoordinates = new[] { new Vector2(0, 0), new Vector2(0, 1), new Vector2(1, 0), new Vector2(1, 1) };
foreach (Vector2 corner in cornerCoordinates)
{
Vector2 ij;
ij.x = cellCoordinates.x + corner.x;
ij.y = cellCoordinates.y + corner.y;
var distanceVector = new Vector2(x - ij.x, y - ij.y);
var index = permutationVector[(int)ij.x % permutationVector.Length];
index = permutationVector[(index + (int)ij.y) % permutationVector.Length];
var gradientVectorNode = gradientVector[index % gradientVector.Length];
finalValue += InterpolationProduct(distanceVector.x, distanceVector.y) * Vector2.Dot(gradientVectorNode, distanceVector);
}
return (float)Math.Max(Math.Min(finalValue, 1f), -1f);
}
To make the comparison with the other option considered, namely value noise, the following graph, taken from [11], shows the difference between the two of them. Fig. 10

3.b.2. The Noise Matrix: The noise matrix is the base on which this functional creation method works, a 2D representation of the terrain surface. In plain words, a noise matrix is nothing more than a 2D data structure filled with values returned by the Perlin Noise function and modified by an additional set of parameters. To understand how these parameters affect the overall result, we first need to define a set of terms [12]:

1. Amplitude: the distance from the midpoint of a function to its highest or lowest point. In other words, it is the maximum absolute value that a noise function can output.
2. Period: the distance between any 2 repeating points of the function.
3. Frequency: the number of cycles per unit length that a specific coherent-noise function outputs.
4. Noise Octave: one sampled noise function in a series of noise functions added together to form a more complex result. By default, each successive octave has a doubled frequency. Octaves control the level of detail in the Perlin Noise but also increase the calculation time.
5. Persistence: determines how quickly the amplitudes diminish for each successive octave. The amplitude of a successive octave is the product of the previous amplitude and the persistence value;
6. Lacunarity: determines how quickly the frequency increases for each successive octave. The frequency of a successive octave is the product of the previous frequency and the lacunarity. It is for frequency what persistence is for amplitude, except that it increases the frequency rather than diminishing it.

For a detailed description of how the noise matrix is constructed and introduced into the Unity viewport, we need to understand the links between the NoiseMatrixBuilder, MapBuilder and TerrainBuilder classes, a customisation of the editor and the way a Unity object can support scripts on itself. This mechanism has been implemented using a variation of the Builder, Command and Chain-of-Responsibility design patterns; a workflow diagram can be seen below:
Fig. 11

As seen above, the custom "Build" button from the editor creates the command that is sent to the MapBuilder. The MapBuilder is responsible for the actual creation of the complex "map" object, and fulfils this by assembling the responses from the GenerateMapInfo and BuildTexture methods. Further down the chain of command, the GenerateMapInfo method builds a MapInfo object comprised of a noise matrix and a colour matrix. This will be explained in more detail in the colour matrix building section; for now, our interest is the noise matrix, which is created by the NoiseMatrixBuilder in the BuildNoiseMatrix method. Here, all the
parameters described above are used to model the result retrieved from the Perlin Noise algorithm, and some validations and normalisations also happen. However, while the Perlin Noise had a theoretical basis, the operations that happen at this step are based on visual feedback, in the hope of producing a realistic visual representation. Three important processes happen inside the BuildNoiseMatrix method: the initialisation of the octaves vector with the addition and subtraction of the offset parameter — this parameter will prove extremely important in solving the problem of merging map parts during the parallelisation stage —, the construction of the actual noiseValueMatrix taking into account the octaves sampled above, and the normalisation of the matrix to eliminate values outside the wanted range. The initialisation of the octaveOffsetsVector is done with the help of a random number generator, with the addition of the x component of the offset parameter and the subtraction of the y component. This happens because the map moves from W to E as the x component increases and from N to S as the y component increases — this is only a convention and could equally be implemented using only additions or only subtractions. Also, during this step, we estimate the maximum possible height that can be reached by summing up the amplitude values, starting from 1 and repeatedly multiplying by the persistence value. Since the persistence value is in the [0,1] interval, this is a more exact approximation than using the value 1 each time, and it will help again at the merging of map parts. To construct the noise matrix, we need to choose 2 sample points to feed to the PerlinNoise function. This is done by looping over the height and width values, dividing by a scale value for additional control, multiplying by the frequency and adding the value from the octaveOffsetVector.
In addition to this, a subtraction of half the width and half the height removes a zooming effect observed when the scale value changes. The effect was not necessarily bad, so we moved it from the right corner of the generated noise matrix representation to its centre. The 2 obtained values are then fed to the PerlinNoise function, but the result obtained lies in the interval [0,1]. Because we also need negative noise values, we multiply by 2 and subtract 1, changing the interval to [-1,1]. The relevant code can be seen below:

float xValue = (index1 - (width / 2f)) / mapScaleValue * frequencyValue + octaveOffsetVector[index3].x;
float yValue = (index2 - (height / 2f)) / mapScaleValue * frequencyValue + octaveOffsetVector[index3].y;
perlinValue = Noise2d.PerlinNoise(xValue, yValue) * 2 - 1;
Finally, 2 types of normalisation are used: local and global. The local one should be used if the entire map is generated at once, since in this case the minimum and maximum noise values are known and a simple interpolation between these 2 margins is possible. The second one should be used if the map parts are generated in a parallelised way, and this is where the maximum estimated height described above comes into play. The formula used is estimated to some extent, but it provides a fairly good visual result. Firstly, since the value coming from PerlinNoise was moved to the interval [-1,1], it needs to be shifted back, so 1 is added and the result is divided by 2. Furthermore, to get a normalised value, we divide the result by the estimated maximum height, but the result obtained produces many values close to 0 and only a few close to 1. To counter this effect, we tried multiple multiplication factors such as 1.1, 1.4, 1.5, 1.6, etc., and we finally settled on 1.9 because the result it provides is visually pleasing enough. We feel this area could be further researched, but for now we were aiming for a final result as believable as possible, and we did not feel that a roughly 90% underwater terrain would prove any additional point. However, this formula can be modified, depending on the type of game being developed. Below, 2 examples of rendered noise matrices can be seen, with and without the multiplication by the 1.9 factor:
Fig. 12 Noise Matrix normalised with the 1.9 factor
Fig. 13 Noise Matrix normalised without the 1.9 factor

As can be clearly seen, the values produced in the first case are much more diverse, ranging from black to white, whereas in the second case the rendered matrix is much darker. With respect to terrain types, this translates to almost entirely underwater terrain in the second case and a much more well rounded representation in the first. Therefore, the addition of the multiplication factor is justified, even though we have found no theoretical basis for such a formula. To conclude, we want to make sure that in no case does the produced value get out of the interval [0,1], so we clamp it.
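The steps of this section (octave summation with persistence and lacunarity, the estimated maximum height, and the global normalisation with the empirical 1.9 factor plus the final clamp) can be condensed into a small sketch. It is written in Python with hypothetical names and is heavily simplified from the C# implementation; the octave offsets are left out:

```python
def build_noise_matrix(noise2d, size, scale, octaves, persistence, lacunarity):
    """Sketch of the BuildNoiseMatrix step.

    noise2d is any 2D noise function returning values in [-1, 1]."""
    # maximum possible height: sum of the successive octave amplitudes
    max_height, amplitude = 0.0, 1.0
    for _ in range(octaves):
        max_height += amplitude
        amplitude *= persistence
    matrix = [[0.0] * size for _ in range(size)]
    for i in range(size):
        for j in range(size):
            amplitude, frequency, value = 1.0, 1.0, 0.0
            for _ in range(octaves):
                # centre the sample points and control them via scale/frequency
                x = (i - size / 2.0) / scale * frequency
                y = (j - size / 2.0) / scale * frequency
                value += noise2d(x, y) * amplitude
                amplitude *= persistence
                frequency *= lacunarity
            # global normalisation with the empirical 1.9 factor, clamped to [0, 1]
            normalized = (value + 1) / (2.0 * max_height / 1.9)
            matrix[i][j] = min(max(normalized, 0.0), 1.0)
    return matrix
```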
The code bits responsible for the 2 normalisation types, as explained above:

Local Normalisation Mode:
noiseValueMatrix[index1, index2] = Mathf.InverseLerp(minNoiseValue, maxNoiseValue, noiseValueMatrix[index1, index2]);

Global Normalisation Mode:
float normalizedHeight = (noiseValueMatrix[index1, index2] + 1) / (2f * maxHeight / 1.9f);
noiseValueMatrix[index1, index2] = Mathf.Clamp(normalizedHeight, 0, int.MaxValue);

3.b.3. The Colour Matrix: Until now, we have been building and developing the theoretical and algorithmic side of the solution, but from this step on, it is time to get some of the much needed visual results. For this to happen, we remain in the two-dimensional area for now, but we start to add colours to our not yet built terrain. First of all, we create a class Terrain that will represent the type of terrain to be painted. It has to be kept in mind that the creation of the terrain and the addition of colour are two different processes. We also initialise an array of Terrain types from a customised editor panel — we will talk more about this in the Unity customisation section: 2 water regions with values in the intervals [0, 0.16] and [0.16, 0.24], a sand region in the range [0.24, 0.3], 2 grass regions, [0.3, 0.46] and [0.46, 0.6], 2 mountain regions, [0.6, 0.8] and [0.8, 0.96], and a high mountain peak area between [0.96, 1]. Again, these values have been chosen after visual observations, and there is no practical limit with regard to the number of terrain regions to be chosen. If we take a look at Fig. 11, the next step happens in the GenerateMapInfo() method of the MapBuilder class. As we said in the previous section, the object returned is composed of a noise
matrix and a colour matrix. The noise matrix part was already described; the colour matrix is constructed by iterating through the noise matrix and finding into which region of the initialised Terrain array each value fits. Once that has been found, the getColour() method of the Terrain class is called for each value, and a matrix similar to the noise one is built, but instead of noise float values it holds instances of Unity's Color class. The code responsible for this operation can be seen below:
Building of the Colour Matrix:
Color[] colourMatrix = new Color[mapPartSize * mapPartSize];
for (int index2 = 0; index2 < mapPartSize; index2++) {
    for (int index1 = 0; index1 < mapPartSize; index1++) {
        float actualHeight = noiseMatrix[index1, index2];
        for (int index3 = 0; index3 < terrainTypesArray.Length; index3++) {
            if (actualHeight >= terrainTypesArray[index3].getHeightValue()) {
                colourMatrix[index2 * mapPartSize + index1] = terrainTypesArray[index3].getColour();
            } else {
                break;
            }
        }
    }
}

The difference in storage between the noise matrix and the colour matrix is that the former is stored as a bidimensional array and the latter as a one-dimensional array. This is why, in the code above, we address the colour matrix with index2 * mapPartSize + index1. Once the colour matrix has been built, in a similar manner to the noise matrix, the method BuildTexture() from the TerrainBuilder class is called again, but this time with the parameter TextureBuilder.GetTextureColourMap(), mirroring the same process as in the noise matrix case with TextureBuilder.GetTextureHeightMatrix(). This is again a good example of the Command design pattern, because the TerrainBuilder object receives a complex TextureBuilder object and, depending on the method called on that object, it builds the map in the black-to-white noise manner or in a colour oriented manner. The actual texture painting is done in a Unity specific manner, using a Texture2D object, setting parameters such as the filter mode to
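The threshold lookup in the inner loop above can be illustrated in isolation (a Python sketch with made-up region values; the real thresholds live in the Terrain array configured from the editor):

```python
def pick_terrain(height, terrain_types):
    # terrain_types: (threshold, colour) pairs sorted by ascending threshold,
    # mirroring the loop over terrainTypesArray: keep upgrading the colour
    # while the height clears the threshold, stop at the first miss
    colour = terrain_types[0][1]
    for threshold, candidate in terrain_types:
        if height >= threshold:
            colour = candidate
        else:
            break
    return colour

# made-up example regions (not the exact intervals used in the thesis)
regions = [(0.0, "water"), (0.24, "sand"), (0.3, "grass"), (0.6, "mountain")]
```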
point and the wrap mode to clamp, for a more natural look, and then setting the object's pixels to the actual colour matrix and applying it. This is a Unity specific process and will not be presented in detail here. Below, in Fig. 14 and Fig. 15, 2 examples of colour maps are shown, created with the same array of parameters except for the random seed:
Fig. 14 Colour Map 1
Fig. 15 Colour Map 2

3.b.4. Unity Editor Customisation: One of the main reasons why Unity was chosen as the working environment for this paper was the ability to customise its built-in editor in various ways, increasing both productivity and ease of use. Since the software solution is mostly based on visual feedback and multiple minor tweaks that improved the end result, it was extremely important to offer some sort of mechanism to control variables in the code visually. Changing a variable's value in the code multiple times, saving and then restarting the entire application is not a very pleasant process, and we tried to avoid it as much as possible. The first major advantage of Unity's customisability was, as stated above, the ability to expose script parameters out of the code and into the visual inspector. These were presented not only as plain fields reminiscent of web forms, but also as ranges, animation curves or tick boxes. Below, we can see the difference between the variables in code and how they show up in the inspector:
Fig. 16 Inspector View
Fig. 17 Code View

The "Build" button in Fig. 16 was created in another script, written specifically for the customisation of the editor. In the same script, custom functionalities were attached to the "Build" button and the "Automatic Update Option" — they both run the BuildMap() method from Fig. 11. This process is somewhat similar to the AngularJS directive ng-click.

Addition of Functionalities to Editor Buttons:
[CustomEditor (typeof (MapBuilder))]
public class BuilderEditor : Editor {
    public override void OnInspectorGUI() {
        MapBuilder mapGen = (MapBuilder) target;
        if (DrawDefaultInspector()) {
            if (mapGen.automaticUpdateOption)
                mapGen.BuildMap();
        }
        if (GUILayout.Button("Build"))
            mapGen.BuildMap();
    }
}
The choice of colours and range values for the colour matrix and the differentiation of terrain types can be seen in Fig. 17. Again, all the changes done in the Inspector are persistent, and the colour choosing, for example, was much more convenient this way. The final result can thus change drastically very fast while still maintaining understandability. If another person were to use this mechanism, the changes would happen intuitively, and since the Unity Asset Store sells this type of solution, it is important for the solution to be understandable, not only functional. Fig. 17 These are not the only modifications brought to the Unity editor, but since the customisation of Unity is not the main purpose of this paper, we will keep the level of detail regarding this topic to a minimum. For additional information on the topic, we found [13] very useful and well detailed.

3.b.5. Mesh Creation: This section describes the transition from a 2D noise/colour matrix applied as a texture to a Unity plane object to the construction of an actual 3D mesh. Because of Unity's internal working mechanism, the same texture can be applied to the 3D model and will create a coloured representation of a map part. For the actual mesh building we will use triangles and not quads, because the game engine transforms any quad into 2 triangles at the start of the game mode. We want to avoid all these extra calculations; this operation can also be unpleasant at times, because there is no
control over how the quad will actually be split. Also, triangle rasterisation is usually supported directly in hardware. Because of the way Unity's Mesh object creation works, an array of vertices and an array of triangles are required. To understand the process, we first need to explain the connection between the triangles and the vertices and how they are built. For example, if we take the following representation of a 3×3 vertex matrix — in our application this will not be the case, but for the sake of explanation a matrix is more suitable than a 1D array — [[1,2,3][4,5,6][7,8,9]], the following triangles can be formed when iterating it: [1,5,4], [5,1,2], [2,6,5], [6,2,3], [4,8,7], [8,4,5], [5,9,8] and [9,5,6]. Fig. 18 shows more precisely how triangles are formed between vertices:
Fig. 18 Triangle Formation

To be able to construct these data structures, we need to know their exact dimensions before building. The number of vertices is easy to calculate, being equal to the product of the width and the height of the map; this is due to the fact that Unity places points in the viewport at a standard distance of 1. To calculate the number of triangles, we decompose the figure, first calculating the number of squares. Taking into account that between 2 points we can draw a line and that we need 4 points to draw a square, 6 points to draw 2 squares, etc., the number of squares is (width - 1) * (height - 1), or (the number of points on the 1st axis - 1) * (the number of points on the 2nd axis - 1). As we have seen in Fig. 18, any square can be decomposed into 2 triangles — this is specifically the reason why we
did not use quads for the mesh — and any triangle is composed of 3 vertices, so the size of the triangles array is (w - 1) * (h - 1) * 6. Since we want our mesh to be perfectly aligned, we need to find the coordinates of our top leftmost point. In Fig. 18 that would be 1, and if we consider 5 the centre of our plane — with coordinates (0, 0) — 1 would have the coordinates (-1, 1). The formulas for obtaining these coordinates are width_coordinate = (width - 1) / -2 and height_coordinate = (height - 1) / 2, because while going towards the leftmost point the width coordinate decreases but the height coordinate increases. With these pieces of information known, we can finally start iterating through the noise matrix and place the triangles into the triangle array. From Fig. 18 we see again that vertices 3, 6, 7, 8 and 9 are not starting points for any triangle, so, to avoid any additional triangles being drawn, we will not visit those. When passing through each remaining vertex — respectively 1, 2, 4, 5 in this case; this can also be seen as a big time optimisation, since the number of visited points is drastically reduced — we build the 2 triangles that have that point in their composition: (index, index + width + 1, index + width) and (index + width + 1, index, index + 1). To complete the information required by Unity to construct the mesh, we need an array of percentages between 0 and 1 linking each vertex to the map width and height. This is also done in the matrix iteration step, by constructing for each noise value a new Vector2 of index divided by width and index divided by height. Without this, it would not be possible to apply textures to the mesh. Finally, going back to Fig. 11, we call the methods BuildMap and GenerateMapInfo, but inside the second we feed our TerrainBuilder with both a MeshBuilder object and a TextureBuilder.
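The triangle and UV construction described above can be sketched as follows (Python, simplified; vertex indices are 0-based here, unlike the 1-based example of Fig. 18, and the UV computation is one possible reading of the "index divided by width/height" description):

```python
def build_mesh_arrays(width, height):
    # one vertex per grid point; (width-1)*(height-1) quads, each split in 2 triangles
    triangles = []
    uvs = []
    vertex = 0
    for y in range(height):
        for x in range(width):
            # percentage linkage of the vertex to the map width and height
            uvs.append((x / width, y / height))
            # vertices on the right and bottom edges start no triangle
            if x < width - 1 and y < height - 1:
                triangles += [vertex, vertex + width + 1, vertex + width]
                triangles += [vertex + width + 1, vertex, vertex + 1]
            vertex += 1
    return triangles, uvs

triangles, uvs = build_mesh_arrays(3, 3)
# size formulas from the text: width * height vertices, (w - 1) * (h - 1) * 6 indices
assert len(uvs) == 3 * 3
assert len(triangles) == (3 - 1) * (3 - 1) * 6
```

For the 3×3 grid, the first quad yields the triangles [0, 4, 3] and [4, 0, 1], which are the 1-based [1, 5, 4] and [5, 1, 2] from the text shifted down by one.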
The MeshBuilder returns a MeshInfo object with the information described above — specifically the arrays of vertices, triangles and the texture percentages — and the TextureBuilder returns the needed texture. Again, this is a perfect example of the mix between the Builder and Command design patterns, decoupling the object that calls the operation from the one that performs it, while also accepting different "commands" as invoking parameters. This way, multiple complex objects can be built and assembled without the caller knowing the details. The end results of the mesh construction process can be seen in Fig. 19 and Fig. 20, with and without an applied colour texture:
Fig. 19 Mesh with coloured texture
Fig. 20 Mesh without coloured texture

From Fig. 20 it can easily be observed that the areas with water are not as sharp in depth as the mountainous areas. A specific Unity component named Animation Curve has been used to obtain this effect, since otherwise there would not have been enough close-to-flat looking terrain. This is once again a visual fix, since the algorithm itself probabilistically produces values with increasing or decreasing gradients and not that many values with the same gradient. Also, in section 3.b.2 we described 2 different methods of normalisation that move the noise values back to the interval [0,1]. If we remove these 2 mechanisms, the terrain becomes much sharper, losing some of the believability effect. Also, since we only apply textures and not complex objects for the water area, the final look would not suit our needs. The creation of effects that simulate water is a specific topic outside our research area, and for now we feel that this way of representing the water, even without the proper depth, is, visually speaking, realistic enough; further research could be done in this area. A problem raised by this method of mesh creation is that Unity does not allow the creation of meshes that contain more than 65535 vertices. This means that the maximum width * height of a game world is somewhere around 255 * 255. We know that game worlds have a tendency to take hours and hours to explore, so such a value is extremely small, even for a simple game. Even if a bigger mesh object could be created, it would still be impossible to build it all at once in a reasonable amount of time with an average computer's processing power. With the end of this section, we can conclude that it is possible to build game worlds in an algorithmic way.
In the next sections, we will explain our approach to solving the problem of the size limitation, discuss the problems that arise with the solution and implement a mechanism used in today's industry to improve the building time of the entire map.
3.b.6. Parallelisation: As we started to explain at the end of the previous chapter, the problem at hand seems ideal for parallelisation. A working implementation in this direction could be much faster and could remove the vertex limit imposed by Unity. The removal would be positive in nature, since a single threaded implementation would not work properly in a real game context even without it. Another limitation we have to take into account while developing our solution is the fact that Unity does not allow meshes to be built or textures to be applied outside of its main thread. The proposed workaround is to use threads to calculate the necessary information for the noise matrix, colours and meshes, feed all of it to a queue, and use the Unity built-in Start and Update methods to control the main thread into constructing the visual representations. From a distant perspective, this idea was inspired by how the scheduler works in an operating system. The data queues holding the above information need to be locked, because different threads could try to access them at the same time, leading to a concurrency problem. In a way, we could say that it is a reinterpretation of the classical producer-consumer problem, with Unity's main thread as the consumer and all the other threads as producers. Before continuing with detailed explanations regarding the threading mechanism, we need to say a few words about the Start() and Update() methods of the standard MonoBehaviour class. The Start method, as the name suggests, is run at the start of any script and is usually used for initialisations or for running any type of script at the beginning of the game. The Update method, on the other hand, is run continuously, frame by frame, and can be used for calculations that concern the position of moving objects, but not only that.
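The locked queue with callbacks described above can be sketched like this (Python's threading module in place of C# threads; the class and method names are hypothetical, not the ones used in the thesis code):

```python
import threading

class MapInfoStorage:
    # locked queue: worker threads enqueue (callback, result) pairs,
    # the main thread drains them once per frame in Update
    def __init__(self):
        self._lock = threading.Lock()
        self._queue = []

    def enqueue(self, callback, map_info):
        with self._lock:
            self._queue.append((callback, map_info))

    def drain(self):
        # swap the buffer out under the lock, then run callbacks outside it
        with self._lock:
            items, self._queue = self._queue, []
        for callback, map_info in items:
            callback(map_info)

storage = MapInfoStorage()
received = []
# producers: one thread per requested map piece
workers = [threading.Thread(target=storage.enqueue, args=(received.append, piece))
           for piece in range(8)]
for w in workers:
    w.start()
for w in workers:
    w.join()
storage.drain()  # consumer: what Update would do on the main thread
```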
Multiple Update functions can run during the same frame on the main thread, since any Unity script can define one and multiple scripts can be active at once. In the threading mechanism explained later in this section, the Update method will have a very important role on the consumer side. In Fig. 21, a flow diagram with the dependencies between method calls can be seen. We consider this the best way of representing the threading mechanism, mostly because Unity programming is not primarily concerned with class segregation, but rather with applying various scripts to objects directly in the viewport. With this in mind, we focus on the functionality of the threading mechanism. Its structure was built into this shape because of constraints imposed by Unity's internal workings.
Fig. 21 The Threading Mechanism

Firstly, we can see that the methods Start and Update both call UpdateMapPieces, a function responsible for checking whether a new map piece should appear in a defined area around an object of interest. This object of interest is simply an empty Unity object that will be dragged around the viewport to simulate a player. Start calls this method because we want this behaviour to happen right from the beginning of the scene, and Update is used to check, frame by frame, whether additional parts should be created or whether any map part is too far away from the player and needs to be cleared. However, cleared in this context means becoming invisible, not getting completely destroyed. When the player returns to that spot, we do not want to recreate the piece, but only to set its visibility back to true. This is a tradeoff between storage and building speed, and for the sake of smoother functionality we chose speed. For each piece determined to be built, the MapPiece constructor is called, initialising the coordinates of the new map piece and its bounds — a specific Unity property for delimiting objects so that they do not overlap. However, the constructor does not know anything about the noise or colour matrices, so it calls the GetMapInfo method. Practically, this is the base of the multithreading process, since UpdateVisibleMapPieces can determine multiple
parts to be built, so multiple constructors will be called in different threads. However, GetMapInfo is nothing but a thread starter with a callback pointer to MapInfoReceived and the coordinates of the map centre as parameters; this method must be run from the main thread. As we said, GetMapInfo simply starts a thread, delegating to MapInfoThread to be more precise. MapInfoThread does two important things: firstly, it calls GenerateMapInfo, a function whose functionality has been described in 3.b.2 and 3.b.3, which returns the actual noise and colour matrices. Secondly, it stores this information, together with the callback pointer to MapInfoReceived, in the MapInfoStorage queue. At this moment these functions terminate, but another Update method, in the MapBuilder class, also checks frame by frame whether anything has been placed in the queue. If anything is found, the Update function runs the method pointed at by the callback pointer, with the noise and colour matrices as parameters. Once again, the Command design pattern can be seen in action here. Since all the callback pointers point to MapInfoReceived, nothing else will be run. MapInfoReceived builds the textures and then runs UpdateMapPiece, starting the mesh building process. The mesh is built structurally in the same way as the noise and colour matrices, using a queue for the storage of the mesh information — as presented in 3.b.5 — built by different threads, and callbacks to a function that initialises the actual mesh and finalises the entire operation. It is very important to make sure that a concurrency problem does not appear at the two storage areas. In Fig. 22, an example of the concurrency issues can be seen.
Fig. 22 Concurrency Issue Spikes
The example in Fig. 22 is a very interesting one, because the threads did not change any pointers, but only height/colour values, producing the spikes effect. With the concurrency issue left unaddressed, however, Unity usually crashes because of a bad memory addressing error. Another important observation can be made based on Fig. 22: the built map pieces do not match properly. Mountains drop abruptly at the border of a piece, seas turn to fields in the neighbouring one; it is crystal clear that this is not yet a completely functional prototype. We are not entirely pleased that we did not manage to fully parallelise the mesh and map info building processes, but the two were interconnected, since a mesh cannot be built without the map information. Again, this part could be further researched to create a more optimised mechanism, but in the next section we will focus on the merging of the map parts.

3.b.7. The Merging Problem: As shown in Fig. 22, the different map parts do not align perfectly. Initially, we considered a solution involving parsing all the map parts' noise matrices already created by the threads and applying an additional interpolation function between the values on the border. However, this approach introduces additional calculations, making the multithreading somewhat pointless, since not only the noise values would need to be recalculated, but also the colours and meshes. In contrast to this, we found a pleasing workaround by using the 2 offset values as parameters in the building of the noise matrix. In section 3.b.2 we talked about the calculation of the 2 sample points that need to be fed to the PerlinNoise function. In the formulas defined then, the offset values were added after the division by the scale parameter and the multiplication by the frequency.
By moving these values before the division, they will also be affected by the scale and frequency; if we also feed the proper offsets to the map building threads, this ensures that the produced result looks nearly continuous. The only remaining issue was the local interpolation function (again described in 3.b.2) that took 0 and 1 as the minimum and maximum values between which the lerping took place. However, this was not the case in all situations, so the global interpolation function (also described in 3.b.2) was introduced, producing much better results based on the maximum estimated height. With these two fixes in place, the final representation looks much more fluent. Fig. 23 exemplifies this:
Fig. 23 Fluent maps merged together As we can clearly see, the mountain from the selected map part continues smoothly both to the east and to the south, the meshes being correctly merged. This problem arose because of the implementation details, and so did the solution to it. It is known that a lot of research exists in the area of fitting and merging geometrical structures together, but we are happy to have found this method in order to continue the development process. Clearly, a much more complex solution could be used here to also address the nature of the terrain clustering on the entire built surface. 3.b.8. Dynamic Adaptive Meshes: To conclude the chapter describing the implementation details, we propose a mechanism to transform meshes dynamically, such that the meshes closer to the object of interest have a higher number of triangles than those at a bigger distance. Multiple advanced techniques for reducing the quality of a mesh can be found in [14], but we chose a much simpler approach, since mesh reduction is not the intended topic of our work. Even though we lose a lot of information about the actual mesh, in the construction stage (described in detail in 3.b.5), when we iterate the noise matrix to introduce triangles in the triangle array, we can increase the loop index jump step. If we do this in our example from Fig. 18, by doubling the jump index, we will build triangles between [1,6,4] and [6,1,3]. This increases the area
of the triangles, but it also leaves some of the points out of consideration. This can lead to degenerate triangles, as seen in Fig. 24:
Fig. 24 Degenerate Triangles Fig. 25 Highest Mesh Resolution
Fig. 26 Lower Mesh Resolution Fig. 27 Lowest Mesh Resolution Because we increase the jump step in the repetitive loop, we also need to change the formula for calculating the number of vertices. We are not entirely sure that the following formula is correct in all situations, given the fact that we managed to obtain degenerate triangles, but we use it nonetheless, since it appears to be functional in most cases: vertices per line = (map width - 1) / loop step + 1. Also, since the number of vertices per line is no longer the same as the width of the map, we need to change the parameters in the triangle addition method. The value for the loop step, or the mesh simplification value as we call it, is initialised in the Start method, because we can and must calculate it before the actual map creation. It is then used at the mesh initialisation step, at the end of the threading process. The final result can be observed in Fig. 28:
Fig. 28 Adaptive Mesh Building As we can clearly see, the meshes in the middle of the image, closer to the object of interest that holds the actual centre coordinates, have a greater level of detail than the ones closer to the borders. This effect was inspired by several actual AAA games that use it to decrease the graphical processing power needed to run the game, based on the fact that a player requires a higher resolution only in its immediate neighbourhood. Chapter 4. Solution Evaluation: In the following section, a few experiments are performed to give an overview of the implemented solution. Given the visual nature of the topic studied, many of them focus on the controllability of the produced map and involve parameter changes to highlight different aspects of the terrain. However, since the threading mechanism was designed precisely to diminish the running time of the solution, some data regarding execution times are also presented. An important thing to take into consideration is that all the experiments below were carried out manually, so the size of the samples is not particularly large. Nonetheless, since no
meaningful differences have been spotted, we are inclined to state that the results produced hold a high validity factor. 4.a. Visual Experiments: To carry out this category of operations, we first need to define a set of default parameter values to be considered as a standard. Most of them are involved in the noise matrix building step. We came up with these after multiple creation processes and we consider the end result to be acceptable from the point of view of visual credibility. These are:
1. ScaleValue: used to divide the values fed to the PerlinNoise function, for more controllability. (default value = 64)
2. MeshMultiplyValue: the mesh produced by Unity is nearly flat, but holds the right height proportions between the vertices. Because of that, each height value needs to be multiplied by a factor. (default value = 45)
3. AnimationCurve: used to flatten some of the values below 0, since we do not use specific water effects. (default values = [-0.117, 1])
4. NumberOfOctaves: the number of times the results of different noise functions are composed, and the length of the offset vector. (default value = 10)
5. PersistenceValue: determines how quickly the amplitudes diminish for each successive octave. Since persistence varies between 0 and 1, we chose the centre of the interval. (default value = 0.5)
6. LacunarityValue: determines how quickly the frequency increases for each successive octave. This value is not upper bounded, but higher values destroy Unity's texture. (default value = 2)
The image in Fig. 29 represents a map piece built with these standard parameter values and a random seed of 10 for the pseudorandom number generator (we did not include this value in the standard parameters list, since it completely changes the map without any degree of control); also, the colours chosen are strictly based on design preferences and will not be considered. However, the value thresholds for the colours can be found in section 3.b.4.
Fig. 29 Map Part produced with standard parameters The next two images, Fig. 30 and Fig. 31, depict the terrain changes produced by a variation of the scale value. The sharpness of the regions is drastically affected. Fig. 30 ScaleValue = 85 Fig. 31 ScaleValue = 20 For a mesh multiplying value example, we reset the scale value to the standard, so that the effect is clearer when compared with Fig. 29.
Fig. 32 MeshMultiplier = 20 Fig. 33 MeshMultiplier = 100
The animation curve affects the end result visually, after all the values have been calculated. It is an editor effect and we use it mostly for the water surface effects. An animation curve with values spanning the complete [-1,1] interval would not look realistic or be usable in most games, as we see it. As Fig. 34 shows (the point with coordinates (0,0,0) is the start point of the coloured axes), the values under 0 are, probabilistically, about as numerous as the ones above 0 and would have to be filled with a complex water texture. We know that in the real world this might actually be the case, but we do not think that such a map would be playable.
Fig. 34 Animation Curve with [-1,1] values Regarding the number of octaves sampled, this is probably one of the most influential parameters, since it directly affects the sampled point values and also determines the number of random values generated, providing the true variety of the map. However, as we will see below, while the number of octaves drastically influences the looks at first, after a certain value is reached the effect is no longer that visible and the additional computations do not make the tradeoff worthwhile.
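The octave composition controlled by these parameters can be sketched as follows (Python used for illustration; perlin1d is a cheap stand-in for the real noise function, which in the actual solution is Unity's Mathf.PerlinNoise):

```python
import math

def perlin1d(x):
    # Illustrative stand-in for a real noise function, just to make the
    # octave loop below runnable; it returns a value in [0, 1].
    return (math.sin(x * 12.9898) + 1.0) / 2.0

def octave_noise(x, octaves, persistence, lacunarity, scale):
    amplitude, frequency = 1.0, 1.0
    total, max_total = 0.0, 0.0
    for _ in range(octaves):
        total += perlin1d((x / scale) * frequency) * amplitude
        max_total += amplitude
        amplitude *= persistence   # amplitudes shrink each octave
        frequency *= lacunarity    # frequencies grow each octave
    return total / max_total       # normalise back into [0, 1]
```

With persistence = 0.5, each successive octave contributes half the amplitude of the previous one, which is why the visual effect of extra octaves fades quickly after a handful of iterations.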
Fig. 35 NumberOfOctaves = 1 Fig. 36 NumberOfOctaves = 2
Fig. 37 NumberOfOctaves = 3 Fig. 38 NumberOfOctaves = 4 Around 5 octaves, the looks seem to stabilise, and even if we add many more, the end result is not affected in a visible way. This can be seen in Fig. 39, where we use 20 octaves to prove our statement. Fig. 39 NumberOfOctaves = 20 To finalise our visual representation experiments, we provide examples of different persistence and lacunarity values. However, even small variations from the standard parameter list provided can make the map unplayable, so we asked ourselves whether we should make these parameters constant or not. For the sake of the diversity of the possible representations, we will not do that, because we might have overlooked some case that a user will not. We keep in mind that persistence controls the amplitude and lacunarity controls the frequency of the function.
Fig. 40 PersistenceValue = 0.33 Fig. 41 PersistenceValue = 0.73
Fig. 42 LacunarityValue = 1 Fig. 43 LacunarityValue = 6 At the end of all these visual experiments, we consider that our map building system offers a high degree of controllability without affecting its diversity, and is reliable if used with the right parameters. However, when we think about believability, it is obvious that further improvement is required, especially in the texture and material areas. Nonetheless, this is not necessarily related to programming, but rather to design and modelling, showing the limitations of this sort of system. 4.b. The Speed Experiment: In the paper's introductory chapter we described the desired properties of our solution. Nearly all of them have been addressed in the section before, except one: speed. This was taken into account during development (the other reason, besides the limit on the number of vertices, for developing the threading system), but now the time has come to show why the proposed solution is better than creating our representation with a single thread. To prove our point, we propose a simple experiment: building a 240×240 map with a single thread, building 4 120×120 map pieces using 4 threads, and building 16 60×60 map pieces using 16 threads. In this way, the same amount of terrain is created, the varying elements being the number of threads and the size of each map piece. The standard parameters list defined at the beginning of the chapter was used, the most important parameter with respect to the number of computations being the number of octaves. Since the number of initialised threads is not directly configurable, we changed the maximum distance threshold values calculated from the central point until we obtained the desired number of threads (50-100-150). Also, 1, 4 and 16 threads were used because these numbers were small enough that we did not interfere with the limit of threads Unity can start at the same time.
Needless to say, we crashed Unity a few times in the process.
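The shape of the experiment can be sketched as a small timing harness (Python used for illustration; note that in CPython the global interpreter lock limits the speedup obtainable for pure-Python work, unlike the C# threads used in the actual solution):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def build_noise_matrix(size):
    # Stand-in for the per-piece map building work of 3.b.2/3.b.3.
    return [[(x * y) % 7 for x in range(size)] for y in range(size)]

def timed_build(piece_size, pieces):
    # Builds `pieces` map parts of piece_size x piece_size concurrently
    # and returns the elapsed wall-clock time in seconds.
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=pieces) as pool:
        list(pool.map(build_noise_matrix, [piece_size] * pieces))
    return time.perf_counter() - start
```

For example, timed_build(240, 1), timed_build(120, 4) and timed_build(60, 16) correspond to the three configurations measured below.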
This experiment was carried out manually, for a total of 150 runs, leading to the following values in seconds:

1-Thread Solution (240×240)   4-Thread Solution (4×120×120)   16-Thread Solution (16×60×60)
0.649   0.192   0.119
0.728   0.243   0.082
0.673   0.267   0.103
0.732   0.247   0.119
0.738   0.294   0.063
0.789   0.244   0.088
0.679   0.263   0.092
0.804   0.282   0.081
0.689   0.295   0.073
0.679   0.229   0.092
0.718   0.267   0.091
0.784   0.209   0.097
0.643   0.294   0.099
0.749   0.202   0.060
0.773   0.269   0.085
0.775   0.259   0.073
0.680   0.299   0.116
0.796   0.292   0.112
0.727   0.263   0.113
0.806   0.220   0.054
0.626   0.300   0.055
0.709   0.295   0.058
0.627   0.193   0.106
0.807   0.280   0.082
0.742   0.273   0.118
0.685   0.204   0.063
0.760   0.202   0.074
0.644   0.250   0.067
0.805   0.271   0.064
0.778   0.204   0.089
0.706   0.291   0.063
0.658   0.238   0.106
0.744   0.215   0.117
0.780   0.232   0.111
0.735   0.285   0.109
0.809   0.230   0.073
0.796   0.281   0.089
0.635   0.293   0.064
0.628   0.237   0.091
0.745   0.192   0.083
0.789   0.273   0.099
0.641   0.262   0.117
0.808   0.255   0.109
0.798   0.256   0.117
0.630   0.274   0.062
0.666   0.272   0.057
0.703   0.267   0.117
0.624   0.214   0.109
0.781   0.254   0.092
0.766   0.296   0.095
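As an illustration of how the per-configuration averages can be compared, the following sketch computes the means and speedups over the first five rows of the table above:

```python
# First five timing rows from the table, per configuration (seconds).
one_thread      = [0.649, 0.728, 0.673, 0.732, 0.738]
four_threads    = [0.192, 0.243, 0.267, 0.247, 0.294]
sixteen_threads = [0.119, 0.082, 0.103, 0.119, 0.063]

def mean(xs):
    return sum(xs) / len(xs)

# Speedup relative to the single-threaded build.
speedup_4 = mean(one_thread) / mean(four_threads)
speedup_16 = mean(one_thread) / mean(sixteen_threads)
```

Over this subset the 4-thread configuration is roughly 2.8 times faster and the 16-thread one roughly 7 times faster than the single-threaded build, already showing the sublinear scaling discussed next.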
To make the time improvement even clearer, we constructed the following graph, which shows a big improvement between the 1-thread and 16-thread solutions. However, the difference between 4 and 16 threads is not that big, a result in accordance with Amdahl's Law. It is clear that adding further threads will not necessarily bring a much faster solution, also because the time needed for starting and stopping threads becomes relevant.
With these results in hand, we can finally claim that our solution partially satisfies the desired properties. However, from this last experiment we saw that the difference between 4 and 16 threads is not that big, making us wonder what might happen with a much greater number of threads. Because of the manual way these tests were performed, we did not run this last test, but we take the liberty of believing that, for example, a 100-thread solution would be slower than a 4-thread one. With this in mind, we think that a limit on the maximum number of threads that can be started should be placed somewhere around 32. Nevertheless, we did not implement such an improvement.
[Bar chart: average map building time in seconds for the 1-thread, 4-thread and 16-thread solutions; vertical axis from 0 to 0.9]
Chapter 5. Conclusion: 5.a. Summary: This paper aims at solving the problem of algorithmically building an outdoor 3D map that is as realistic as possible, without using specific modelling techniques. It focuses on building a two dimensional matrix of heights through a personal implementation of the Perlin Noise algorithm, together with a colour matrix. We also provide a mathematical proof of how Ken Perlin came up with his interpolation function, an important part of the algorithm itself. From those, the solution constructs textures and applies them to specific Unity plane objects for a visual representation. To translate this into a 3D result, it constructs a spatial mesh and introduces a multiplication factor for the mesh height. At this point we could say that we had a working solution, but we ran into a problem with Unity's internal limits: the maximum number of mesh vertices. We construct a multithreading system for building multiple map parts at the same time, solving both the vertices issue and the speed issue. We solve the problem of merging these parts by passing information about the offsets and the position of their centres between threads. For a further runtime improvement, we also propose a system that changes the meshes' level of detail dynamically, with respect to the distance from a specific object of interest. The implementation is fast, reliable, offers a high degree of controllability, produces diverse terrain shapes and is somewhat credible. This last property is the one that needs further improvement in our opinion, because despite all our efforts, the map still does not look real enough. Perhaps our solution was somewhat too focused on single-threaded map building; the parallelisation process, even if it produces a major speed improvement and manages to merge the map parts, does not create a fully credible looking map. The next and final section takes a look at possible future improvements, from the author's point of view. 5.b.
Improvement Plans: As we figured out in the last section, our solution requires improvements first of all on the credibility side. This happens firstly because no single area contains every available type of terrain, and secondly because our solution does not perform any sort of clustering between map parts.
That would mean that it should create multiple map components with the same type of mountains, for example, to build a more realistic mountain range. This is also the case for large seas, deserts or rainforests. We can think of two ways of doing this: by randomly choosing which type of terrain appears with a higher probability on a map part and then sending this information to the adjacent map building threads, or by removing a type of terrain from an already generated map part. Also, an example coming to mind would be to subtract two height matrices from one another if we want to build a more realistic island archipelago. Since the topic we study is only a small element of a much bigger problem, namely the building of a completely auto-generative computer game, we could try to address that. This includes the implementation of a weather system, a network of roads, settlements, intelligent agents, quests to create storylines, or even game objectives and rules. We cannot think of an actual limit to this area of research, because it encompasses pretty much anything that can be seen in the actual world or that can be imagined. The last step in this process could be to introduce an additional intelligence level, so that the game could modify itself to fit the player's behaviour and needs. It is a well known fact that people tend to show their real nature more and more in the virtual world rather than in the real one, so the development of such an application could prove beneficial for humanity, helping it learn more about itself. However, we shall remain somewhat reserved regarding this matter. No implementation of anything close to something like that exists, and it is also an ethical question whether it ever should.
References:
1. Noor Shaker, Julian Togelius, and Mark J. Nelson (2016). Procedural Content Generation in Games: A Textbook and an Overview of Current Research. Springer. ISBN 978-3-319-42714-0.
2. Wittgenstein, L.: Philosophical Investigations. Blackwell (1953).
3. Mifrah Ahmad, Lukman Ab Rahim, Kamisah Osman and Noreen Izza Arshad (2017). Towards Modelling Effective Educational Games Using Multi-Domain Framework. Encyclopedia of Information Science and Technology, Fourth Edition. IGI Global. DOI 10.4018/978-1-5225-2255-3.ch291.
4. Jamie "Thrrrpptt!" Madigan (June 2001). "Half-Life: Blue Shift". Archived from the original on December 16, 2008. Retrieved 2009-04-02.
5. https://www.economist.com/the-economist-explains/2014/09/24/why-video-games-are-so-expensive-to-develop
6. http://vgsales.wikia.com/wiki/Most_expensive_video_games
7. Korn, O.; Lee, N. (Eds.) (2017). Game Dynamics: Best Practices in Procedural and Dynamic Game Content Generation. Springer. ISBN 978-3-319-53087-1.
8. Deb, K., Pratap, A., Agarwal, S., Meyarivan, T.: A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation 6(2), 182-197 (2002).
9. https://mrl.nyu.edu/~perlin/doc/oscar.html#noise
10. https://mzucker.github.io/html/perlin-noise-math-faq.html#computation
11. David S. Ebert, F. Kenton Musgrave, Darwyn Peachey, Ken Perlin, Steven Worley: Texturing & Modelling: A Procedural Approach, 3rd edition. Morgan Kaufmann (2003).
12. http://libnoise.sourceforge.net/glossary/
13. https://docs.unity3d.com/Manual/ExtendingTheEditor.html
14. Libor Váša. Methods for dynamic mesh size reduction. Technical Report No. DCSE/TR-2006-07, University of West Bohemia (2006).
