  1. 3D Engines: Used for 3D games, simulators, modelers, imaging apps, etc. Examples include Unreal Engine (royalty-based) and Godot (free). Description: 3D engines create and display three-dimensional scenes, often utilizing advanced features such as dynamic lighting, physics, and reflections.

  2. 2D Engines: Used for 2D games, UI systems, drawing apps, etc. Examples include Cairo (free 2D engine) and GLUI (free UI library). Description: 2D engines primarily focus on two-dimensional graphics, with better support for typography, vector shapes, and other elements specific to 2D applications.

  3. Custom Engines: Designed for special or legacy situations and platforms, like Doom-style "two and a half D" BSP renderers. Description: Custom engines are tailored to meet specific requirements or to support particular platforms, providing unique rendering solutions for niche use cases.

  4. Realtime vs. Non-realtime: Realtime engines are used for interactive applications like games, while non-realtime engines are used for tasks such as video editing, effects, and animation sequences. Description: Realtime engines prioritize speed and interactivity, while non-realtime engines focus on precision and quality, often at the expense of speed.

  5. Technology/Algorithm/Feature-Based Engines: Characterized by their underlying rendering techniques or supported feature sets, these engines vary widely in their capabilities.

    Description: These engines employ various rendering techniques, algorithms, or feature sets to achieve specific visual results or optimize for certain use cases.

    Render Parameters:

    1. Shading: The process of computing the levels of light, darkness, and color across the surfaces of objects in a rendered image. It underpins most graphic effects and is essential for achieving realistic visuals.
    2. Ray Tracing: A technique that renders a scene's imagery by tracing the paths of light rays and simulating the behaviors of natural light. It provides stunning visual realism but can be computationally intensive.
    3. Ray Casting: A basic algorithm for rendering 3D spaces without simulating real-world light properties. It is less realistic than ray tracing but allows for high-speed, real-time applications like 3D video games and VR.
    4. Refraction: A technique that simulates the bending of light rays when passing through transparent or translucent surfaces. It is often used in conjunction with ray tracing to achieve more realistic visuals.
    5. Texture Mapping: The process of applying detail, surface texture, or color information to a 3D model by mapping images onto its surface via the model's UV coordinates.
    6. Bump Mapping: A technique that simulates small bumps and dents on the surface of a 3D model by perturbing its surface normals, giving the illusion of detail without modifying the underlying geometry.
    7. Volumetric Lighting: A technique that treats light as a visible volume, simulating scattering and absorption in participating media to create effects such as fog, smoke, steam, and light shafts. It adds depth and atmosphere to a rendered scene.
    8. Depth of Field: A camera effect defined by the range of distances within which objects appear in sharp focus; objects nearer or farther than that range are progressively blurred. It is used for artistic effect in renders.
    9. Motion Blur: A feature that simulates the streaking of fast-moving objects across a frame, as captured by a real camera while its shutter is open. It is used primarily in film VFX and animation to achieve a more lifelike appearance.
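To make the techniques above concrete, the following sketches illustrate each one in miniature. First, shading (item 1): a minimal Lambertian diffuse model, where brightness falls off with the cosine of the angle between the surface normal and the light direction. The function names here are illustrative, not from any particular engine.

```python
import math

def normalize(v):
    """Scale a 3-vector to unit length."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def lambert_shade(normal, light_dir, base_color):
    """Diffuse (Lambertian) shading: brightness is proportional to the
    cosine of the angle between the surface normal and the light direction."""
    n = normalize(normal)
    l = normalize(light_dir)
    # Clamp to zero so surfaces facing away from the light go dark.
    intensity = max(0.0, sum(a * b for a, b in zip(n, l)))
    return tuple(c * intensity for c in base_color)

# A surface facing straight up, lit from directly above, keeps its full color.
print(lambert_shade((0, 1, 0), (0, 1, 0), (1.0, 0.5, 0.25)))  # -> (1.0, 0.5, 0.25)
```

Real engines layer specular highlights, shadows, and ambient terms on top, but this cosine term is the core of most shading models.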
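For ray tracing (item 2), the fundamental operation is intersecting a light ray with scene geometry. A sketch of the classic ray-sphere test, assuming a unit-length ray direction:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None.
    Solves the quadratic |origin + t*direction - center|^2 = radius^2."""
    oc = tuple(o - c for o, c in zip(origin, center))
    a = sum(d * d for d in direction)
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None  # hit must lie in front of the ray origin

# A ray down the z-axis hits a unit sphere centred 5 units away at t = 4.
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # -> 4.0
```

A full ray tracer repeats this test against every object, then spawns secondary rays toward lights and along reflections, which is where the computational expense comes from.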
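Ray casting (item 3) can be sketched in its Wolfenstein-style form: march a single ray through a 2D tile map until it enters a wall, and use the travelled distance to size a wall slice on screen. This naive fixed-step marcher (production renderers use the exact DDA grid traversal instead) is illustrative only:

```python
import math

# 1 = wall, 0 = empty space, in a tiny top-down tile map.
GRID = [
    [1, 1, 1, 1, 1],
    [1, 0, 0, 0, 1],
    [1, 0, 0, 0, 1],
    [1, 1, 1, 1, 1],
]

def cast_ray(x, y, angle, step=0.01, max_dist=20.0):
    """March along the ray in small steps until a wall tile is entered,
    returning the distance travelled."""
    dx, dy = math.cos(angle), math.sin(angle)
    dist = 0.0
    while dist < max_dist:
        if GRID[int(y)][int(x)] == 1:
            return dist
        x += dx * step
        y += dy * step
        dist += step
    return max_dist

# Facing +x from inside the room: the east wall is about 1.5 units away.
print(round(cast_ray(2.5, 2.0, 0.0), 2))
```

No light transport is simulated at all, which is exactly why this runs fast enough for real-time use.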
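Refraction (item 4) reduces to Snell's law applied to vectors. A sketch, assuming unit-length incident and normal vectors (the function name is illustrative; GLSL exposes a similar `refract` built-in):

```python
import math

def refract(incident, normal, n1, n2):
    """Bend a unit incident direction through an interface using Snell's law.
    Returns the refracted unit vector, or None on total internal reflection."""
    ratio = n1 / n2
    cos_i = -sum(i * n for i, n in zip(incident, normal))  # cosine with the normal
    sin2_t = ratio * ratio * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:
        return None  # total internal reflection: no transmitted ray
    cos_t = math.sqrt(1.0 - sin2_t)
    return tuple(ratio * i + (ratio * cos_i - cos_t) * n
                 for i, n in zip(incident, normal))

# A ray hitting a glass surface head-on (n = 1.5) passes straight through.
print(refract((0, 0, -1), (0, 0, 1), 1.0, 1.5))  # -> (0.0, 0.0, -1.0)
```

A ray tracer calls this at every transparent surface hit, spawning a new ray in the returned direction.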
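Texture mapping (item 5) is, at its simplest, a lookup: UV coordinates in [0, 1] select a texel from an image. A nearest-neighbour sampler over a hypothetical 2x2 checker texture (real samplers add filtering and wrap modes):

```python
# A tiny 2x2 "checker" texture stored as rows of RGB tuples.
TEXTURE = [
    [(255, 255, 255), (0, 0, 0)],
    [(0, 0, 0), (255, 255, 255)],
]

def sample(texture, u, v):
    """Nearest-neighbour texture lookup: map UV coordinates in [0, 1]
    to a texel. V grows downward here, matching common image layouts."""
    h, w = len(texture), len(texture[0])
    x = min(int(u * w), w - 1)  # clamp so u == 1.0 stays inside the image
    y = min(int(v * h), h - 1)
    return texture[y][x]

print(sample(TEXTURE, 0.1, 0.1))  # -> (255, 255, 255)
print(sample(TEXTURE, 0.9, 0.1))  # -> (0, 0, 0)
```

The rasterizer interpolates each triangle's UVs per pixel and feeds them through a lookup like this.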
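Bump mapping (item 6) never moves any vertices; it only tilts the shading normal using the slope of a height map. A finite-difference sketch over a hypothetical height map (a ramp rising toward +x):

```python
import math

# A small height map: a ramp rising from left to right.
HEIGHT = [
    [0.0, 1.0, 2.0],
    [0.0, 1.0, 2.0],
    [0.0, 1.0, 2.0],
]

def bumped_normal(height, x, y, strength=1.0):
    """Perturb a flat +z surface normal using central differences of the
    height map; the underlying geometry itself is never modified."""
    dhdx = (height[y][x + 1] - height[y][x - 1]) * 0.5
    dhdy = (height[y + 1][x] - height[y - 1][x]) * 0.5
    n = (-strength * dhdx, -strength * dhdy, 1.0)
    length = math.sqrt(sum(c * c for c in n))
    return tuple(c / length for c in n)

# The ramp rises toward +x, so the normal tilts back toward -x.
print(bumped_normal(HEIGHT, 1, 1))
```

Feeding this perturbed normal into a shading function like the Lambertian one above is what creates the illusion of relief on flat geometry.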
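Volumetric effects (item 7) are usually computed by ray-marching through the medium, attenuating light slice by slice per the Beer-Lambert law. A sketch for a uniform fog of absorption coefficient sigma (parameter names are illustrative):

```python
import math

def march_fog(depth, sigma, steps=100):
    """Ray-march through a uniform participating medium, accumulating how
    much light survives (Beer-Lambert absorption). For constant sigma this
    converges to exp(-sigma * depth)."""
    dt = depth / steps
    transmittance = 1.0
    for _ in range(steps):
        transmittance *= math.exp(-sigma * dt)  # light absorbed in this slice
    return transmittance

# Marching matches the closed-form answer for uniform fog.
print(round(march_fog(4.0, 0.5), 4), round(math.exp(-0.5 * 4.0), 4))
```

The stepwise loop looks redundant for constant density, but it is what lets real implementations handle spatially varying smoke and in-scattered light shafts, where no closed form exists.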
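Depth of field (item 8) is typically modelled with the thin-lens circle of confusion: how large a blur disc a point projects to, given how far it sits from the focus plane. A sketch with illustrative parameter names, all distances in metres:

```python
def circle_of_confusion(focal_len, aperture, focus_dist, subject_dist):
    """Thin-lens blur: diameter of the circle of confusion for a point at
    subject_dist when the camera is focused at focus_dist."""
    return (aperture * focal_len * abs(subject_dist - focus_dist)
            / (subject_dist * (focus_dist - focal_len)))

# A point exactly at the focus distance renders perfectly sharp.
print(circle_of_confusion(0.05, 0.025, 2.0, 2.0))  # -> 0.0
```

Renderers use this diameter to choose a blur radius per pixel (or to jitter rays across the lens aperture), so objects off the focus plane smear while the subject stays crisp.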
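Finally, motion blur (item 9) amounts to averaging what the camera sees over the interval the shutter is open. Production renderers sample random times per ray; this sketch uses evenly spaced times for reproducibility, and all names are illustrative:

```python
def motion_blur_sample(position_at, shutter_open, shutter_close, samples=8):
    """Approximate motion blur by averaging a point's position at several
    moments while the virtual shutter is open."""
    total = 0.0
    for i in range(samples):
        # Sample at the midpoint of each sub-interval of the exposure.
        t = shutter_open + (shutter_close - shutter_open) * (i + 0.5) / samples
        total += position_at(t)
    return total / samples

# An object moving linearly at 10 units/s, shutter open for 1/50 s:
blurred_x = motion_blur_sample(lambda t: 10.0 * t, 0.0, 0.02)
print(blurred_x)  # approximately 0.1, the midpoint of the motion
```

Averaging full rendered samples instead of a single coordinate produces the characteristic streak of a fast-moving object across the frame.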