
Conversation


@febb0e febb0e commented Sep 10, 2025

Custom Metadata Feature

Closes #4409

Overview

Introduces a custom metadata feature for test cases and user keywords. Users can define metadata for both using the [CustomName] syntax.

Implementation Details

  • Custom Metadata Syntax:
    • Any setting in square brackets (e.g., [Owner], [Requirement], [My Custom Metadata]) is now treated as custom metadata.
    • Custom metadata is supported in both test cases and user keywords.
    • Metadata values can be single or multi-valued, and support variable resolution.
  • Parsing and Output:
    • Custom metadata is parsed and stored in the test/keyword model.
    • Custom metadata is included in the output XML and HTML reports by default (an illustrative sketch follows this list).
    • Output and parsing logic was adapted to treat all unknown square bracket settings as custom metadata.
  • Backwards Compatibility:
    • Test data that previously caused errors for undefined square bracket settings (e.g., [UndefinedSetting]) is now parsed as valid custom metadata instead.
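
As a purely illustrative sketch of how such metadata might appear in output.xml (the <meta> element name and its placement under <test> are assumptions for illustration, not taken from this PR's schema changes):

<test id="s1-t1" name="User Login Test">
  <meta name="Owner">Robot</meta>
  <meta name="Requirement">REQ-001</meta>
  <kw name="Log">...</kw>
  <status status="PASS" .../>
</test>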

CLI Functionality: --custommetadata Flag

  • Purpose:
    • The --custommetadata flag lets users filter which custom metadata keys are considered "valid" and therefore included in the output files.
  • Behavior:
    • No flag set: All custom metadata keys are valid and included in output and parsing.
    • Flag set with one or more keys: Only the specified custom metadata keys are considered valid and included in output and parsing. Others are ignored. Test cases and user keywords do not fail.
    • Important: The flag does not influence the execution of tests or keywords. It only affects which custom metadata is carried through to the output files (see the sketch after this list).
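
A minimal sketch of that behavior (the file name checkout_tests.robot and the metadata names below are made up for illustration): only the keys passed via --custommetadata survive into the output files, while the test itself is unaffected.

*** Test Cases ***
Checkout Smoke Test
    [Owner]       QA Team
    [Priority]    High
    [Internal]    Link to the team's Scrum board
    Log    Running checkout smoke test

# Only 'Owner' and 'Priority' appear in output.xml and the reports;
# '[Internal]' is ignored, but the test still runs and passes.
robot --custommetadata Owner --custommetadata Priority checkout_tests.robot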

Test Coverage

  • Acceptance Tests:
    • Acceptance tests were created to cover major scenarios:
      • Basic custom metadata parsing
      • Edge cases (empty values, special characters, Unicode, etc.)
      • Performance (many metadata entries)
      • Variable resolution and built-in expressions
      • CLI filtering with --custommetadata
      • Output and report validation
  • Unit Tests:
    • Unit tests ensure parsing, model storage, and output generation for custom metadata.

Notable Changes

  • Error Handling:
    • Square bracket settings that do not match known settings are now always treated as custom metadata. Tests that expected errors for undefined settings will therefore no longer fail; those settings are parsed as metadata instead (see the example below).
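
For illustration (this is a sketch of the behavior described above, not a test case taken from the PR):

*** Test Cases ***
Legacy Test With Unknown Setting
    # Previously this line caused a parsing error for a non-existing setting;
    # with this change it is stored as the custom metadata entry 'UndefinedSetting'.
    [UndefinedSetting]    Some value
    Log    The test body executes normally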

Summary

This feature allows users to attach the metadata they need to tests and keywords. Custom metadata values support the same formatting as the suite-level Metadata setting (*bold*, _italic_, `code`), variable resolution, multi-line content, etc. The --custommetadata flag provides control over the output without affecting test execution.

Usage Examples

Basic Custom Metadata

*** Test Cases ***
User Login Test
    [Documentation]    Test user authentication functionality
    [Owner]             Robot
    [Requirement]      REQ-001
    [Priority]         High
    [Component]        Authentication
    Log    Testing user login

Variable Resolution

*** Variables ***
${BUILD_NUMBER}       2024.1.0
${TEST_ENVIRONMENT}   ${{'Development' if os.getenv('CI') is None else 'CI/CD'}}
${TEAM_EMAIL}         qa-team@company.com

*** Test Cases ***
API Integration Test
    [Documentation]    Test API integration with dynamic metadata
    [Build Version]    ${BUILD_NUMBER}
    [Environment]      ${TEST_ENVIRONMENT}
    [Contact]          ${TEAM_EMAIL}
    [Created Date]     ${{datetime.datetime.now().strftime('%Y-%m-%d')}}
    Log    Test API integration

Text Formatting

*** Test Cases ***
Formatted Metadata Test
    [Documentation]    Test with text formatting
    [Description]      *Critical* test for _user registration_ flow
    [Steps]       1. Navigate to registration page
    ...                2. Fill form with _valid test data_
    ...                3. Submit and verify *success message*
    [Bug Report]       BUG-123: See ``validate_user()`` function
    [Test Data]        File: ``test_users.json`` with *valid* entries
    [Requirements]     REQ-001    REQ-002    REQ-003
    Log    Test with formatted metadata

Inline Python Evaluation

*** Test Cases ***
Dynamic Metadata Test
    [Documentation]    Test with dynamically calculated metadata
    [System Info]      ${{platform.system()}} ${{platform.release()}}
    [Python Version]   ${{sys.version.split()[0]}}
    [Timestamp]        ${{datetime.datetime.now().isoformat()}}
    [File Count]       ${{len(os.listdir('/tmp'))}} files in temp
    Log    Test with dynamic metadata

Keyword Metadata

*** Keywords ***
Database Connection
    [Documentation]    Establish database connection with metadata
    [Owner]            DB Team
    [Timeout]          ${{'30s' if os.getenv('FAST_MODE') else '60s'}}
    [Dependencies]     [http://www.example.com|Example Dependency]
    [Performance]      *Optimized* for _high-load_ scenarios
    [Notes]            Connection enabled with ``max_connections=50``
    ...
    ...                Auto-retry on connection failure
    Log    Connecting to database
    No Operation

CLI Filtering Examples

# Include all custom metadata (default)
robot tests.robot

# Only include 'Owner' and 'Priority' metadata in output
robot --custommetadata Owner --custommetadata Priority tests.robot

# Only include 'Bug Report' and 'Test Data' metadata in output
robot --custommetadata "Bug Report" --custommetadata "Test Data" tests.robot

febb0e and others added 20 commits July 4, 2025 16:05
- Add support for custom metadata tags in test cases and keywords
- Include test file with examples of custom metadata usage
- Enhance parsing and model handling for custom metadata
- Update XML schema and output generation for metadata support
…us/robotframework into feature/custom-metadata-enhancements
- Introduced `CustomMetadata` setting in RobotSettings.
- Updated parsing and lexing logic to validate custom metadata.
- Enhanced command-line options to specify allowed custom metadata names.
- Modified relevant classes and methods to accommodate custom metadata.
…ve initialization in transformers and model classes; update tests to accommodate changes in metadata structure.
…ing names and updating documentation for command line options
…mprove validation checks, and ensure consistent initialization across settings and builders.
- Implement tests for custom metadata in the TestSuiteBuilder, covering basic filtering, keywords, special values, case sensitivity, and integration with FileSettings.
- Create tests for custom metadata modifiers, including access and modification during pre-run and execution phases, complex filtering, and error handling.
- Develop tests for UserKeyword custom metadata, ensuring handling of various data types, normalization, and integration.
- Validate custom metadata behavior with embedded arguments, setup/teardown, and keyword binding.

Bouska commented Sep 26, 2025

Just my 2cts on the syntax choice.

I get the appeal of that syntax but it has too many flaws to be considered a good choice:

  • Typo Management: currently, if you make a typo like [Tag] instead of [Tags], either your IDE or Robot is going to point out the problem; with this syntax, the tags you set are just going to be interpreted as metadata, so the code is going to work, but not with the expected behavior
  • Backward/Forward Compatibility Limitations: let's say you want to have a metadata entry [Return] — is it the old deprecated setting or just metadata? Likewise, if a future RF release wants to add a new setting, the choice will be either 'don't do it so as not to break current user code' or 'maybe some existing code is going to break'
  • Principle of Least Surprise: there is currently consistency between the test suite settings and the test case settings; e.g. for documentation it goes:
***Settings***
Documentation    Some documentation

to

My test case
    [Documentation]    Some documentation

it does not make a lot of sense to break that pattern for metadata and have something like this:

***Settings***
Metadata    My key    My value

to

My test case
    [My key]    My value

@Noordsestern
Member

Hello @Bouska ,

thank you for the valuable feedback!

Typos caused some discussion when developing this feature. Originally, the most popular opinion was to let

robot  mytest.robot

fail when it has custom metadata. You would have to provide a list of all custom metadata at launch so that typos could be caught:

robot --custommetadata Owner --custommetadata Priority tests.robot   # would fail if I had a typo like [Oner]

That approach was later changed in favor of the view that metadata should be only metadata and not influence the default execution. Also, the use cases we know of want to provide many custom metadata pairs, and adding them all on the CLI would not be feasible.

I like your points about typo management, compatibility and the principle of least surprise.

Do you have an idea how to satisfy all expectations?

@fhennig42

I really would like to see this merged as it would help us to store custom Zephyr Scale related metadata directly in the tests.


Bouska commented Oct 29, 2025

Do you have an idea how to satisfy all expectations?

I would go for the following:

My test case
    [Metadata]    My key1    My value1
    [Metadata]    My key2    My value2

So it mirrors how Metadata works in the Settings section. If you have a typo in the key, that is on you, but it won't impact Robot Framework; if you make a typo in Metadata, Robot Framework will rightfully fail. And since there is only one new setting, Metadata, there is no problem with backward or forward compatibility.
