- 3D Engines: Used for 3D games, simulators, modelers, imaging apps, etc. Examples include Unreal Engine (royalty-based) and Godot (free).
  Description: 3D engines create and display three-dimensional scenes, often using advanced features such as dynamic lighting, physics, and reflections.
- 2D Engines: Used for 2D games, UI systems, drawing apps, etc. Examples include Cairo (a free 2D graphics library) and GLUI (a free UI library).
  Description: 2D engines focus on two-dimensional graphics, with better support for typography, vector shapes, and other elements specific to 2D applications.
- Custom Engines: Designed for special or legacy situations and platforms, such as Doom-style "two-and-a-half-D" BSP renderers.
  Description: Custom engines are tailored to specific requirements or platforms, providing unique rendering solutions for niche use cases.
- Realtime vs. Non-realtime: Realtime engines are used for interactive applications such as games, while non-realtime engines are used for tasks such as video editing, effects, and animation sequences.
  Description: Realtime engines prioritize speed and interactivity, while non-realtime engines focus on precision and quality, often at the expense of speed.
- Technology/Algorithm/Feature-Based Engines: Characterized by their underlying techniques or supported features, these engines vary widely in their capabilities.
  Examples include:
  - Scanline rendering: Common in 3D engines built on OpenGL, DirectX, or Vulkan.
  - Raytracing: A 3D method increasingly integrated into modern engines.
  - Path/shape/glyph rendering: Used in 2D systems like Windows GDI, macOS Quartz, and PostScript.
  - Raymarching: Efficiently renders irregular shapes such as fractals, e.g. Marble Marcher (a sphere-tracing sketch follows this list).
  - REYES architecture: Found in RenderMan, Aqsis, and similar engines.
  Description: These engines employ various rendering techniques, algorithms, or feature sets to achieve specific visual results or optimize for certain use cases.
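To make the raymarching entry above concrete, here is a minimal sphere-tracing sketch over a signed distance function. The scene (a single sphere), the function names, and the step limits are illustrative assumptions, not code from any engine listed above: the marcher advances along the ray by the scene's distance estimate until it gets close enough to the surface to count as a hit.

```python
import math

def sphere_sdf(p, center=(0.0, 0.0, 3.0), radius=1.0):
    """Signed distance from point p to a sphere: negative inside, positive outside."""
    dx, dy, dz = p[0] - center[0], p[1] - center[1], p[2] - center[2]
    return math.sqrt(dx * dx + dy * dy + dz * dz) - radius

def raymarch(origin, direction, max_steps=64, max_dist=20.0, epsilon=1e-3):
    """Sphere tracing: step along the ray by the scene's distance estimate.
    Returns the hit distance, or None if the ray escapes the scene."""
    t = 0.0
    for _ in range(max_steps):
        p = (origin[0] + direction[0] * t,
             origin[1] + direction[1] * t,
             origin[2] + direction[2] * t)
        d = sphere_sdf(p)       # distance to the nearest surface from p
        if d < epsilon:         # close enough to the surface: report a hit
            return t
        t += d                  # safe to advance by the full estimate
        if t > max_dist:
            break
    return None

# A ray fired straight down +z from the origin hits the sphere near t = 2.
print(raymarch((0.0, 0.0, 0.0), (0.0, 0.0, 1.0)))
```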
Render Parameters:
- Shading: The process of computing levels of light, darkness, and color across the surfaces of objects in a rendered image. It can be used to apply a range of graphic effects and is essential for achieving realistic visuals (a combined ray-casting and diffuse-shading sketch follows this list).
- Ray Tracing: A technique that renders a scene by tracing the paths of light rays and simulating the behavior of natural light. It provides strong visual realism but can be computationally intensive.
- Ray Casting: A basic algorithm for rendering 3D spaces without simulating real-world light transport. It is less realistic than ray tracing but fast enough for real-time applications such as 3D video games and VR (see the sketch after this list).
- Refraction: A technique that simulates the bending of light rays as they pass through transparent or translucent surfaces. It is often used in conjunction with ray tracing to achieve more realistic visuals.
- Texture Mapping: The process of applying surface detail or color information to a 3D model by projecting image maps onto the model's UV coordinates (a texel-lookup sketch follows this list).
- Bump Mapping: A technique that simulates bumps and dents on the surface of a 3D model by perturbing its shading normals, giving the illusion of detail without modifying the underlying geometry (a normal-perturbation sketch follows this list).
- Volumetric Lighting: A technique that renders light as it passes through a participating medium such as fog, smoke, or steam, making the beams themselves visible and adding depth to a scene's lighting.
- Depth of Field: The range of distances within which objects in a scene appear in sharp focus, determined by camera settings such as aperture and focal length. It is used for artistic effect in renders.
- Motion Blur: A feature that simulates the streaking of fast-moving objects across a frame, as a real camera would record during its exposure. It is used primarily in film VFX and animation to achieve a more lifelike appearance (a temporal-sampling sketch follows this list).
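To illustrate the ray casting and shading entries above, the sketch below fires a single ray at a sphere and computes a Lambertian (diffuse) brightness at the hit point. The scene, light direction, and numbers are assumptions invented for the example; a real renderer repeats this per pixel and layers on materials, shadows, and more elaborate shading models.

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def intersect_sphere(origin, direction, center, radius):
    """Nearest positive hit distance along a unit-length ray, or None.
    Solves |origin + t * direction - center|^2 = radius^2 for t."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c          # direction is unit length, so a = 1
    if disc < 0.0:
        return None                 # the ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

def lambert(normal, light_dir):
    """Diffuse shading: brightness is max(0, N . L), with L toward the light."""
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

center, radius = (0.0, 0.0, 3.0), 1.0
origin = (0.0, 0.0, 0.0)
direction = normalize((0.1, 0.1, 1.0))
light_dir = normalize((1.0, 1.0, -1.0))   # direction from surface toward the light

t = intersect_sphere(origin, direction, center, radius)
if t is not None:
    hit = tuple(o + d * t for o, d in zip(origin, direction))
    normal = normalize(tuple(h - c for h, c in zip(hit, center)))
    print("diffuse brightness:", lambert(normal, light_dir))
```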
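Texture mapping ultimately comes down to looking up a texel from an image using a model's UV coordinates. The sketch below is the simplest possible version, a nearest-neighbour lookup into a tiny made-up checkerboard; production renderers add filtering (bilinear, mipmapping) and wrap modes.

```python
def sample_texture(texture, u, v):
    """Nearest-neighbour texture lookup: map UV in [0, 1] to a texel.
    `texture` is a list of rows of RGB tuples standing in for a real image."""
    height, width = len(texture), len(texture[0])
    x = min(int(u * width), width - 1)     # clamp so u = 1.0 stays in range
    y = min(int(v * height), height - 1)
    return texture[y][x]

# A 2x2 checkerboard; in a renderer the UVs would come from the mesh.
checker = [[(255, 255, 255), (0, 0, 0)],
           [(0, 0, 0), (255, 255, 255)]]
print(sample_texture(checker, 0.75, 0.25))   # -> (0, 0, 0)
```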
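Bump mapping can be sketched as perturbing the shading normal from a heightmap while leaving the mesh untouched. The finite-difference approximation below is one common way to do it; the heightmap and the strength parameter are illustrative assumptions.

```python
def bumped_normal(heightmap, x, y, strength=1.0):
    """Approximate a bump-mapped normal by finite differences over a heightmap.
    The mesh itself is untouched; only the normal used for shading changes."""
    h = heightmap
    width, height = len(h[0]), len(h)
    # Central differences give the height gradient in x and y (clamped at edges).
    dx = (h[y][min(x + 1, width - 1)] - h[y][max(x - 1, 0)]) * 0.5
    dy = (h[min(y + 1, height - 1)][x] - h[max(y - 1, 0)][x]) * 0.5
    n = (-dx * strength, -dy * strength, 1.0)
    length = (n[0] ** 2 + n[1] ** 2 + n[2] ** 2) ** 0.5
    return tuple(c / length for c in n)

# A tiny height field with a single bump in the middle; just left of the bump
# the normal tilts away from it instead of pointing straight up (0, 0, 1).
bumps = [[0.0, 0.0, 0.0],
         [0.0, 1.0, 0.0],
         [0.0, 0.0, 0.0]]
print(bumped_normal(bumps, 0, 1))
```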
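Motion blur is commonly approximated by sampling the scene at several instants within the shutter interval and averaging the results. The toy sketch below averages only an object's position over the exposure to keep things short; a renderer would average the shaded colour of each time sample instead.

```python
def motion_blurred_position(position_at, shutter_open, shutter_close, samples=8):
    """Temporal supersampling: average an object's position over the exposure."""
    total = [0.0, 0.0, 0.0]
    for i in range(samples):
        # Sample times are spread evenly across the shutter interval.
        t = shutter_open + (shutter_close - shutter_open) * (i + 0.5) / samples
        p = position_at(t)
        for axis in range(3):
            total[axis] += p[axis]
    return tuple(c / samples for c in total)

def moving(t):
    """Position of an object moving along +x at 10 units per second."""
    return (10.0 * t, 0.0, 0.0)

# Blurred over a 1/50 s exposure, the object's average position is near x = 0.1.
print(motion_blurred_position(moving, 0.0, 0.02))
```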