
Fact Checking Methodology

Technical Verification Protocol

UFO/UAP Claims

With this methodology, our site's aim is not to "prove" or "deny" UFO claims, but to offer a technically substantiated, reproducible, and transparent assessment of the evidence, without exaggeration, alarmism, or bias.

Fact Checking

Primary Claim Recording
(Claim Logging & Typology)

Each claim is registered in an internal incident tracking system with a unique ID and structured fields (structured claim registry).

Basic recording fields:
  • Timestamp (UTC & local time)

  • Geolocation (lat/long in WGS84, altitude, margin of error)

  • Claim type (visual sighting, radar track, SIGINT intercept, photographic evidence, eyewitness testimony, etc.)

  • Source tier (primary / secondary / tertiary)

  • Evidence medium (video, image, telemetry, textual, hybrid)

  • Metadata integrity score (based on completeness & reliability)
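
To make the registry concrete, the following is a minimal sketch of how such a structured claim record could be represented in Python; the class, field names and score range are illustrative, not the exact schema of our internal tracking system.

from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum
from typing import Optional
import uuid

class ClaimType(Enum):
    VISUAL_SIGHTING = "visual sighting"
    RADAR_TRACK = "radar track"
    SIGINT_INTERCEPT = "SIGINT intercept"
    PHOTOGRAPHIC = "photographic evidence"
    TESTIMONY = "eyewitness testimony"

@dataclass
class ClaimRecord:
    """One entry in the structured claim registry (illustrative fields)."""
    claim_type: ClaimType
    latitude: float                  # WGS84 decimal degrees
    longitude: float                 # WGS84 decimal degrees
    altitude_m: Optional[float]      # metres above sea level, if known
    position_error_m: float          # stated margin of error
    source_tier: str                 # "primary" / "secondary" / "tertiary"
    evidence_medium: str             # "video", "image", "telemetry", "textual", "hybrid"
    metadata_integrity: float        # 0.0-1.0 completeness & reliability score
    observed_at_utc: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))
    claim_id: str = field(default_factory=lambda: uuid.uuid4().hex)  # unique ID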

Incident Typology
(UAP Classification Taxonomy):
  • Class A - Verified physical or sensor-based data

  • Class B - High-fidelity photographic/video evidence

  • Class C - Testimony or anecdotal reports without corroboration

  • Class D - Claims of unknown or unverifiable origin

Primary & Secondary Sources
(Source Vetting & Provenance Analysis)

The reliability of a piece of information depends primarily on the traceability of its provenance.

Control steps:
  • Verification of origin (e.g. IP geolocation, EXIF geotags, DNS traces on online sources).

  • Cross referencing source identities (OSINT methods: reverse image search, WHOIS, archives).

  • Assessment of witness credibility (background consistency check, timeline validation).

  • Chain of custody evaluation: has the content been altered from the time of first publication to our access?

Tools:
  • FotoForensics (ELA analysis)

  • InVID (video verification toolkit)

  • ExifTool (metadata extraction)

  • Archive.org (temporal versioning)
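
As an illustration of temporal versioning checks, the snippet below queries Archive.org's public Wayback availability endpoint for the closest archived snapshot of a URL; it assumes the Python requests package and is only a sketch of one step in provenance analysis.

from typing import Optional
import requests

def wayback_snapshot(url: str) -> Optional[str]:
    """Query Archive.org's Wayback availability API for the closest
    archived snapshot of a URL (useful for chain-of-custody checks)."""
    resp = requests.get("https://archive.org/wayback/available",
                        params={"url": url}, timeout=15)
    resp.raise_for_status()
    snap = resp.json().get("archived_snapshots", {}).get("closest")
    return snap["url"] if snap else None  # archived copy, or None if not indexed

# Example: compare the earliest archived version against the page as accessed today
# print(wayback_snapshot("example.com/uap-report"))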

Audiovisual Material Analysis
(Forensic Media Analysis)

Photo and video verification is based on digital forensics techniques.

Metadata Extraction & Integrity

  • EXIF parsing → GPS, timestamp, camera make/model

  • Hash verification (SHA-256) for integrity checks

  • Error Level Analysis (ELA) for alterations

  • Bit-level inspection of containers (MP4, MOV, JPEG) to detect traces of editing
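
A minimal sketch of the first two steps, assuming Python with the standard library and a separately installed ExifTool binary; the selected tag names are illustrative.

import hashlib, json, subprocess

def sha256_of(path: str) -> str:
    """Integrity hash of the raw evidence file, recorded before any analysis."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def exif_fields(path: str) -> dict:
    """Extract EXIF metadata with ExifTool (installed separately);
    -json gives machine-readable output, -n keeps GPS values numeric."""
    out = subprocess.run(["exiftool", "-json", "-n", path],
                         capture_output=True, text=True, check=True)
    tags = json.loads(out.stdout)[0]
    return {k: tags.get(k) for k in
            ("Make", "Model", "DateTimeOriginal",
             "GPSLatitude", "GPSLongitude", "GPSAltitude")}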

Geospatial Verification

  • Reverse geolocation via Google Earth & Sentinel Hub

  • Sky orientation validation (azimuth, elevation, heading)

  • Shadow length / sun angle reconstruction for temporal alignment

  • Star map & celestial object cross-check via Stellarium or Heavens Above
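
As an example of shadow length / sun-angle reconstruction, the sketch below checks whether a measured shadow is consistent with the solar elevation at the claimed time and place; the elevation itself would come from Stellarium or a solar ephemeris, and the function names and tolerance are illustrative.

import math

def expected_shadow_length(object_height_m: float, sun_elevation_deg: float) -> float:
    """Length of the shadow a vertical object of known height should cast
    at the claimed time, given the sun's elevation at that time and place."""
    if sun_elevation_deg <= 0:
        return float("inf")          # sun below the horizon: no daylight shadow
    return object_height_m / math.tan(math.radians(sun_elevation_deg))

def timestamp_consistent(measured_shadow_m: float, object_height_m: float,
                         sun_elevation_deg: float, tolerance: float = 0.15) -> bool:
    """Flag the claimed timestamp as inconsistent if the measured shadow
    deviates from the expected length by more than the tolerance fraction."""
    expected = expected_shadow_length(object_height_m, sun_elevation_deg)
    return abs(measured_shadow_m - expected) <= tolerance * expected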

Motion & Trajectory Analysis

  • Frame-by-frame vectorization (optical flow algorithms)

  • Trajectory reconstruction (parallax correction, camera movement compensation)

  • Comparison with flight paths (ADS-B / Mode-S data from Flightradar24 or ADS-B Exchange)

  • Speed/altitude estimation (using known reference points)
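
A minimal sketch of frame-by-frame motion estimation using OpenCV's dense (Farneback) optical flow, assuming the opencv-python and numpy packages; it returns only a crude per-frame mean displacement, before any parallax correction or camera-movement compensation.

import cv2
import numpy as np

def mean_flow_per_frame(video_path: str) -> np.ndarray:
    """Dense optical flow between consecutive frames, returning the mean
    displacement vector (dx, dy) in pixels per frame as a motion signature."""
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        raise ValueError("could not read video")
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    vectors = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        vectors.append(flow.reshape(-1, 2).mean(axis=0))
        prev_gray = gray
    cap.release()
    return np.array(vectors)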

Cross-Checking with Independent Data

(Correlative Data Fusion)

For each claim, data from multiple sources are fused with the aim of eliminating identifiable, conventional causes (IFO elimination).

Data categories to cross-reference:

  • Meteorological (METAR, TAF, atmospheric soundings)

  • Astronomical (planets, satellites, meteors)

  • Civil/military air traffic (ADS-B, NOTAMs)

  • LEO/GEO satellite data (e.g. Heavens-Above, Celestrak)

  • Sensor systems (radar, SIGINT reports, if available via FOIA or leaks)
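
As an example of one fusion step, the sketch below flags aircraft from an exported ADS-B dataset that were within a given radius of the sighting; the CSV column names and radius are assumptions, and a real correlation would also match timestamps and altitudes.

import csv, math

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in kilometres between two WGS84 points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 6371.0 * 2 * math.asin(math.sqrt(a))

def aircraft_near_sighting(adsb_csv: str, sighting_lat: float,
                           sighting_lon: float, radius_km: float = 30.0):
    """Return aircraft positions (from an exported ADS-B CSV with assumed
    'icao', 'lat', 'lon' columns) within a radius of the sighting,
    as candidate conventional explanations."""
    hits = []
    with open(adsb_csv, newline="") as f:
        for row in csv.DictReader(f):
            d = haversine_km(sighting_lat, sighting_lon,
                             float(row["lat"]), float(row["lon"]))
            if d <= radius_km:
                hits.append((row["icao"], round(d, 1)))
    return hits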

Probability Analysis & Ranking

(Assessment & Classification)

Each claim is evaluated with a multi-parameter scoring model.

Evaluation criteria:

  • Source reliability (R-score)

  • Media integrity (M-score)

  • Correlation strength (C-score)

  • Explainability index (E-index) - probability of natural/anthropogenic explanation

Outcome categories:

  • Verified Event → high R/M/C, low E-index

  • Indeterminate / Unresolved → moderate or contradictory data

  • Explained / Hoax → documented explanation or hoax detection

  • Pending / Under Investigation → awaiting additional information
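
Purely as an illustration of how the four criteria could be combined, the sketch below maps normalised scores onto the outcome categories; the weights and thresholds are illustrative and do not represent our actual model.

def classify_claim(r_score: float, m_score: float, c_score: float,
                   e_index: float) -> str:
    """Illustrative rule: scores assumed normalised to 0-1, with e_index the
    estimated probability of a natural or anthropogenic explanation."""
    if e_index >= 0.8:
        return "Explained / Hoax"
    evidence = (r_score + m_score + c_score) / 3  # equal weights, for illustration
    if evidence >= 0.75 and e_index <= 0.2:
        return "Verified Event"
    if evidence < 0.4:
        return "Pending / Under Investigation"   # too little evidence to judge yet
    return "Indeterminate / Unresolved"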

Documentation & Transparency

(Documentation & Transparency Protocol)


The research steps are recorded in an audit trail (immutable log).

  • Each finding is accompanied by a citation, link or hash reference.

  • The methods and tools are made public along with the evaluation.

  • The files (raw evidence) are stored offline for integrity reasons.

 

Standards:

  • RFC 3161 (trusted timestamping)

  • SHA-256 hashes / PGP signatures for verifying evidence files

  • Transparency reports with changelog for each claim
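
A minimal sketch of a hash-chained audit trail in Python: each entry embeds the SHA-256 hash of the previous one, so later tampering is detectable. A production setup would additionally obtain RFC 3161 timestamps from an external time-stamping authority; the class and field names are illustrative.

import hashlib, json, time

class AuditTrail:
    """Append-only log; each entry commits to the previous entry's hash."""
    def __init__(self):
        self.entries = []
        self._prev_hash = "0" * 64

    def append(self, finding: str, reference: str) -> None:
        entry = {"ts": time.time(), "finding": finding,
                 "reference": reference, "prev": self._prev_hash}
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        self._prev_hash = entry["hash"]
        self.entries.append(entry)

    def verify(self) -> bool:
        """Recompute the chain and confirm no entry was altered or reordered."""
        prev = "0" * 64
        for e in self.entries:
            body = {k: e[k] for k in ("ts", "finding", "reference", "prev")}
            expected = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["prev"] != prev or e["hash"] != expected:
                return False
            prev = e["hash"]
        return True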

 

The scientific process requires constant revision.

 

  • Incident files are reviewed periodically (or when new data emerges).

  • New versions of assessments are released with versioning (v1.0, v1.1, etc.).

  • Each change is documented publicly on the incident report page.

Testimony Categorization

  • Level A - Multiple credible eyewitnesses (pilots, astronomers, military personnel, professionals)

  • Level B - One reliable eyewitness (accurate description, without contradictions)

  • Level C - Anonymous or poorly documented testimony (unclear or unconfirmable)

  • Level D - Unverified social media reports (clickbait, memes, rumors)

Reliability Factors

Each report is evaluated based on 5 key criteria, each with a score of 0 to 5 points.

Maximum total: 25 points

  • Location & Conditions - clear geographical location, known weather conditions (0–5)

  • Number & credibility of witnesses - many eyewitnesses or reliable sources (0–5)

  • Material (Photo/Video/Radar) - quality, authenticity and corroboration of the material (0–5)

  • Control/verification capability - cross-checking against independent sources, technical analysis (0–5)

  • Absence of alternative explanation - not readily explained by natural phenomena (0–5)

Reliability Scale

  • 0–7 - Unreliable 🔴 - incomplete or conflicting information

  • 8–14 - Low 🟠 - limited data, requires further verification

  • 15–20 - Medium 🟢 - serious report with good indications but some gaps

  • 21–25 - High 🔵 - very well-documented, reliable report; warrants scientific examination

Optional Indicators

  • Radar corroboration (e.g. military or civil radars)

  • Cross-reference with satellite data

  • Object flight analysis (speed, direction, height)

  • Analysis by independent experts

 

 

These indicators can add +1 to +5 bonus points when the supporting data are particularly strong.
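
For illustration, the sketch below totals the five criteria, applies the optional bonus, and maps the result onto the reliability scale above; it mirrors the worked example that follows.

def reliability_category(criteria_scores, bonus=0):
    """Sum the five 0-5 criteria scores, add the optional bonus (+1 to +5),
    and map the total onto the reliability scale."""
    if len(criteria_scores) != 5 or not all(0 <= s <= 5 for s in criteria_scores):
        raise ValueError("expected five scores in the range 0-5")
    total = sum(criteria_scores) + max(0, min(bonus, 5))
    if total <= 7:
        return total, "Unreliable 🔴"
    if total <= 14:
        return total, "Low 🟠"
    if total <= 20:
        return total, "Medium 🟢"
    return total, "High 🔵"

# The worked example below: 5 + 5 + 5 + 4 + 4 = 23, plus a +3 radar bonus = 26 → High
# print(reliability_category([5, 5, 5, 4, 4], bonus=3))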

Evaluation Example

  • Location/Conditions: 5

  • Number of witnesses (4): 5

  • Material (good analysis + radar corroboration): 5

  • Verification: 4

  • Alternative explanation: difficult → 4

  • Radar Bonus: +3

Total: 26/30 → Category: 🔵 High reliability

UAP - UFO Incident Report

Share your experience safely
