US 2019/0347852 A1

(19) United States
(12) Patent Application Publication        (10) Pub. No.: US 2019/0347852 A1
     Massaro                               (43) Pub. Date:     Nov. 14, 2019

(54) METHOD OF PROCESSING FULL MOTION VIDEO DATA FOR PHOTOGRAMMETRIC RECONSTRUCTION

(71) Applicant: United States of America - US Army, Alexandria, VA (US)

(72) Inventor: Richard D. Massaro, Burke, VA (US)

(21) Appl. No.: 1877204133

(22) Filed: Sep. 29, 2017

Publication Classification

(51) Int. Cl.
     G06F 17/10 (2006.01)
     G06T 17/05 (2006.01)
     G06F 16/78 (2019.01)
     G06F 16/735 (2019.01)
     G06F 16/75 (2019.01)

(52) U.S. Cl.
     CPC: G06F 17/10 (2013.01); G06T 17/05 (2013.01); G06F 16/75 (2019.01); G06F 16/735 (2019.01); G06F 16/78 (2019.01)

(57) ABSTRACT
This invention is a system for photogrammetric analysis of full motion video (FMV), which converts FMV to image files, extracts metadata, and produces accurate 2-D and 3-D geospatial images in real time.

Patent Application Publication    Nov. 14, 2019    Sheet 1 of 11    US 2019/0347852 A1
Step 1: Receiving
FMV data
Step 2: Parsing FMV
data to create image
files and metadata files
Step 3: Performing
interpolation algorithm
to complete metadata
set
Step 4: Iteratively calculating and storing focal length values
Step 5: Iteratively calculating and storing angle values
Step 6: Populating
position value matrix
data object
Step 7: Filtering using a
proximity threshold
value
Step 8: Filtering using a
cluster count threshold
Step 9: Verifying cluster identification (optional)
Step 10: Associating cluster index value with each stored image file
Fig. 1
[Fig. 2: block diagram of System 200 components, connected in sequence:]
Receiver 10
Parser 20
Extractor 30
Focal length and angle processor 40
Data matrix object 50
Proximity filtering processor 60
Cluster count filtering processor 70
Verification processor 80
Indexing component 90
Fig. 2
[Fig. 3: sensor trajectory with available metadata packets 130a-130n and missing metadata packets corrected by interpolation]
Fig. 3
Fig. 4A
[Fig. 4B: exemplary focal-length formula; f is computed from the horizontal and vertical field-of-view dimensions (FOV_h, FOV_v) via their tangents, with a constant (1.058 in the residue shown) reflecting the assumed sensor format]
Fig. 4B
Fig. 5A
Fig. 5B
Fig. 5C
Fig. 6A
Fig. 6B
[Fig. 7: cluster 240 with cluster center 280, user-defined distance 260, and FOV position coordinates of qualifying image files 270a-270h]
Fig. 7
METHOD OF PROCESSING FULL MOTION
VIDEO DATA FOR PHOTOGRAMMETRIC
RECONSTRUCTION
STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT

[0001] The invention described herein was made by an employee of the United States Government and may be manufactured and used by the Government of the United States of America for governmental purposes without the payment of any royalties.
FIELD OF INVENTION

[0002] The invention relates to the field of photogrammetric image analysis, and more specifically to a system for rapidly processing full motion video in real time.
BACKGROUND OF THE INVENTION

[0003] Photogrammetry is the science of making measurements from photographs and of recovering the exact positions of surface points. Photogrammetric computational methods draw upon optics, mathematics, and projective geometry.
[0004] The U.S. Army Corps of Engineers (USACE) is conducting advanced research to meet the challenge of using photogrammetric techniques to transform millions of terabytes of Full Motion Video (FMV) data from unmanned vehicles, wireless cameras, and other sensors into usable 2-D and 3-D maps and models.
[0005] To apply photogrammetric computational models, FMV data gathered from airplanes, satellites, and other sources must be parsed into still frame images. The still frame images contain coordinates and metadata which are extracted for processing by computational models. The exterior orientation of the camera or sensor defines its location in space and its view direction. The inner orientation defines the geometric parameters of the imaging process. The focal length of the imaging lens is a component of these geometric parameters.
[0006] By feeding the measurements from each individually processed image frame into a computational model, it is possible to rapidly and accurately estimate 3-D relative motions and other measurements for each frame. It currently is not possible to produce accurate 3-D models in real time, on a consistent basis, because of limitations inherent in the frame selection process. Frame selection is a costly and time-consuming process that is currently performed manually by human technicians.

[0007] The current state of the art requires human intervention to filter the relevant frames which contain target objects from frames that are irrelevant and should be excluded from computational analysis.
[0008] There is an unmet need for intelligent, automated technologies which can rapidly select relevant image frames for processing from vast amounts of video data.

[0009] There is an unmet need for automated technologies which can detect and filter frames which are distorted or otherwise of insufficient quality for accurate processing.

[0010] There is a further unmet need for a fully automated method for processing FMV data that can address anomalies in acquired data and filter image frames which are unsuitable for processing in real time.
SUMMARY OF THE INVENTION

[0011] This invention is an intelligent system for rapidly processing full motion video (FMV) through interpolation processes to rapidly replicate missing or corrupted data and algorithmic selection of relevant image frames. This system includes processing components and memory storage for metadata and image files.

[0012] The system includes multiple virtual processing components which perform data parsing processes, metadata extraction processes, data interpolation processes, and filtering processes using data collected from other virtual processing components.

[0013] Novel processing components iteratively perform geometric calculations of angle measurements and focal lengths on extracted metadata associated with image files. Virtual processing components perform rapid filtering processes. Other virtual processing components perform novel threshold computation and cluster analysis, including cluster indexing. This method and system allow the rapid production of highly accurate 2-D and 3-D images in real time. The increased processing speed of the system is achieved by configuration of system components to perform novel geometric calculations rather than by increased processing capability.
BRIEF DESCRIPTION OF DRAWINGS

[0014] FIG. 1 illustrates an exemplary computer-implemented method for processing FMV data for automated photogrammetric reconstruction.

[0015] FIG. 2 illustrates an exemplary embodiment of a real-time filtering system for rapid processing of FMV still image frames.

[0016] FIG. 3 illustrates an exemplary process of acquiring data which is subject to metadata loss, which is corrected through interpolation processes.

[0017] FIG. 4A illustrates an exemplary collection of a sensed data set which is extracted and used by the system to calculate the focal length parameter.

[0018] FIG. 4B illustrates an exemplary process for calculating the focal length for each image file which is used by the system for calculating FOV position coordinates.

[0019] FIG. 5A illustrates an exemplary collection of a sensed data set used by the system to calculate photogrammetrically relevant angles.

[0020] FIG. 5B illustrates an exemplary process for calculating the effective photogrammetric yaw (or heading) angle of the sensor relative to the FOV.

[0021] FIG. 5C illustrates an exemplary process for calculating the effective photogrammetric pitch angle of the sensor relative to the FOV.

[0022] FIG. 6A illustrates an exemplary process for calculating the distance between FOV position coordinates.

[0023] FIG. 6B illustrates an exemplary system output, which is a 2-D cluster map depicting the location of clusters the system has identified.

[0024] FIG. 7 illustrates an exemplary process of sampling image frames external to the cluster to verify that the set of image files associated with a cluster is correctly indexed.
TERMS OF ART

[0025] As used herein, the term "angle" means an angle of a sensor relative to a real-world coordinate plane.
[0026] As used herein, the term "cluster" means a collection of filtered image files.
[0027] As used herein, the term "cluster count threshold" means the minimum number of image files which must be associated with a cluster.

[0028] As used herein, the term "color threshold value" means a minimum red green blue (RGB) value that an image file must have to be included in a cluster.

[0029] As used herein, the term "Field of View" (FOV) means the area recorded by the sensor in each video frame.

[0030] As used herein, the term "focal length" means a value used to determine image size or scale.

[0031] As used herein, the term "Full Motion Video" (FMV) means video taken from a camera or other sensor, including motion imagery at a minimum speed of approximately 25 frames per second with metadata, consistent with Department of Defense Motion Imagery Standards Board (MISB) format.

[0032] As used herein, the term "image file" means a video frame that is stored in an image file format as pixels having a red green blue (RGB) color value.

[0033] As used herein, the term "Key-Length-Value" (KLV) means a format for encoding data.

[0034] As used herein, the term "metadata values" means measurements including field of view dimensions, and latitude, longitude, and altitude position coordinates of the sensor that collects the FMV data.

[0035] As used herein, the term "National Imagery Transmission Format" (NITF) means a U.S. Department of Defense (DoD) and Federal Intelligence Community (IC) suite of standards for the exchange, storage, and transmission of digital-imagery products and image-related products.

[0036] As used herein, the term "object" means a processing component that contains both data and data structures and code which performs operations on the data structures.

[0037] As used herein, the term "position coordinates" means a geographic location with variables selected from a group consisting of latitude, longitude, and altitude.

[0038] As used herein, the term "processor" means hardware or software having processing capability which may be bound to non-modifiable values and functions.

[0039] As used herein, the term "proximity threshold" means a maximum distance between image file position coordinates that determines which video frames will be included in a cluster.

[0040] As used herein, the term "virtual processing component" refers to software which performs a computational process and functions identically to the circuitry of a physical processor.
DETAILED DESCRIPTION OF THE INVENTION

[0041] FIG. 1 illustrates an exemplary computer-implemented method for processing FMV data for automated photogrammetric reconstruction 100.

[0042] In various embodiments, Method 100 can be used to produce two- or three-dimensional geographic data that correspond with the subject of the FMV or the location of the sensor that collects the FMV data. Method 100 automatically processes FMV data for delivering rapid two-dimensional (2-D) and three-dimensional (3-D) geospatial outputs.

[0043] In various embodiments, the data source may be a data file or streaming data source. The streaming data source may be continuous or non-continuous. The FMV can be streaming and does not have to be continuous.

[0044] Method 100 calculates the focal length (f) and photogrammetric angles for each video frame. These parameters are needed for automated photogrammetric reconstruction, and Method 100 includes a novel clustering algorithm to process a filtered subset of FMV frames based on the geometry of the sensor trajectory.

[0045] In various embodiments, user input may include an FMV file or stream, the estimated elevation of the FMV subject if the frame center elevation is not present within the location metadata, and the desired number of frames required to define a cluster (cluster count threshold).

[0046] Method 100 calculates the necessary angles and focal length required for automated photogrammetric reconstruction from FMV sources. The clustering algorithm is performed using the data from the angle calculation and focal length calculation. Method 100 further may be automated using various programming scripts and tools such as a novel Python script generator.
[0047] Step 1 is the step of receiving FMV data, including video files and metadata files. The system can receive either a file or a stream and will process either dataset. Data may be received in a continuous or non-continuous manner. The system can process discontinuous data, which accommodates various points of view (e.g., repositioning of the sensor).

[0048] Step 2 is the step of parsing FMV data to create image files and metadata files associated with each image file. Image files and metadata files are iteratively processed, stored, and indexed as data is extracted. In this step, sensor position coordinates and field of view (FOV) dimensions are extracted from metadata sets associated with each image file.

[0049] Each image file includes, but is not limited to: a still image copy corresponding to a video frame from an FMV, having pixels and color (red green blue, or RGB) values. The metadata set includes, but is not limited to, the following variables: latitude, longitude, and altitude of the sensor and vertical FOV dimension FOV_v and horizontal FOV dimension FOV_h.

[0050] In various embodiments, metadata is encoded into U.S. Department of Defense (DoD) FMV data using the Motion Imagery Standards Board (MISB) Key-Length-Value (KLV) format.
[0051] Step 3 is the step of performing an interpolation algorithm to complete the metadata set.

[0052] This step averages multiple sensor position coordinates over time and uses the average to populate missing metadata values.
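The interpolation in Step 3 can be sketched as follows. This is a minimal illustration, assuming missing metadata packets appear as `None` entries in a time-ordered list of sensor positions; the helper name and the choice to average the nearest available neighbor on each side are assumptions, not details from the source.

```python
def interpolate_missing(positions):
    """Fill None entries in a time-ordered list of (lat, lon, alt)
    tuples by averaging the nearest available neighbors on each side.
    Hypothetical helper sketching Step 3's averaging of sensor
    position coordinates over time."""
    filled = list(positions)
    for i, p in enumerate(filled):
        if p is None:
            prev = next((filled[j] for j in range(i - 1, -1, -1)
                         if filled[j] is not None), None)
            nxt = next((filled[j] for j in range(i + 1, len(filled))
                        if filled[j] is not None), None)
            known = [v for v in (prev, nxt) if v is not None]
            # Component-wise average of the known neighbors.
            filled[i] = tuple(sum(c) / len(known) for c in zip(*known))
    return filled
```

For example, a gap between positions (0, 0, 100) and (2, 2, 102) would be filled with (1.0, 1.0, 101.0).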
[0053] Step 4 is the step of iteratively calculating and storing the focal length value for each image file from FOV dimensions.

[0054] This step uses the following variables: vertical FOV dimension FOV_v and horizontal FOV dimension FOV_h. Focal length (f) is calculated by the formula shown in FIG. 4B, which assumes that the sensor has a 4:3 format.
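The figure's formula computes f from the tangents of the FOV dimensions. As a sketch only, the standard pinhole relation below ties a focal length in pixel units to the image width and horizontal FOV by the same tangent form; the patent's own formula additionally builds in the 4:3 sensor assumption, so its constant differs. The function name and use of the horizontal dimension alone are illustrative assumptions.

```python
import math

def focal_length_pixels(image_width_px, fov_h_deg):
    """Standard pinhole relation f = w / (2 * tan(FOV_h / 2)),
    a generic stand-in for the patent's 4:3-format formula."""
    return image_width_px / (2.0 * math.tan(math.radians(fov_h_deg) / 2.0))
```

For a 1920-pixel-wide frame with a 90-degree horizontal FOV, this yields a focal length of 960 pixels.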
[0055] Step 5 is the step of calculating and storing photogrammetrically relevant angle values. In this step, the system calculates yaw (α) and pitch (φ) angles for each image file from sensor position coordinates.

[0056] This step uses the following variables to calculate the yaw angle (α): the vertical change in position of the sensor (Δy) and the horizontal change in position of the sensor (Δx). The yaw angle formula is α = tan⁻¹(Δy/Δx).

[0057] This step uses the following variables to calculate the pitch angle (φ): the distance between sensor and field of view (R_g) and the altitude of the sensor (h). The pitch angle formula is φ = tan⁻¹(h/R_g).
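In the Python idiom the patent itself mentions, Step 5's angle calculations can be sketched as below. Using `atan2` rather than a plain arctangent (to keep quadrants correct) is an implementation choice, not something the source specifies, and the function names are illustrative.

```python
import math

def yaw_angle(dx, dy):
    """Yaw (heading) from the change in sensor position:
    alpha = atan2(dy, dx), per Step 5's variables (delta-x, delta-y)."""
    return math.atan2(dy, dx)

def pitch_angle(altitude_h, ground_range_rg):
    """Pitch of the sensor relative to the FOV from altitude h and
    sensor-to-FOV distance R_g: phi = atan(h / R_g)."""
    return math.atan2(altitude_h, ground_range_rg)
```

Equal Δx and Δy give a 45-degree yaw; a sensor whose altitude equals its ground range to the FOV likewise has a 45-degree pitch.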
[0058] Step 6 is the step of populating a position value matrix data object with position coordinates (longitude, latitude, and altitude) of the field of view (FOV) in each image file.

[0059] The position value matrix data object includes both data structures and virtual processing capability which calculates the distance (d) between longitude and latitude position values (x and y) for each pair of image files. The following formula calculates distance (d) based on position values x and y:

d = √((x₂ − x₁)² + (y₂ − y₁)²)
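The matrix object's distance computation can be sketched as a pairwise Euclidean distance over the (x, y) position values; the dictionary-of-pairs return shape is an assumption for illustration.

```python
import math

def pairwise_distances(coords):
    """Distance d = sqrt((x2-x1)^2 + (y2-y1)^2) for every pair of
    image-file position values (x, y), as the position value matrix
    object computes in Step 6. Returns {(i, j): d} for i < j."""
    d = {}
    for i in range(len(coords)):
        for j in range(i + 1, len(coords)):
            d[(i, j)] = math.hypot(coords[j][0] - coords[i][0],
                                   coords[j][1] - coords[i][1])
    return d
```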
[0060] Step 7 is the step of filtering using a proximity threshold value.

[0061] This proximity threshold value is calculated as a percentage of the horizontal field of view dimension (FOV_h) and vertical field of view dimension (FOV_v).

[0062] This step performs a filtering process by comparing distance (d) to the proximity threshold value. Only image files which have a distance smaller than the proximity threshold value are filtered and stored for further processing. In this step, a counter is incremented to track the number of stored files which meet the proximity threshold value to obtain a count of stored image files.
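Step 7 can be sketched as below. The source says only that the threshold is "a percentage of" the FOV dimensions; the 10% default and the averaging of FOV_h and FOV_v into one threshold are assumptions made for illustration.

```python
def proximity_filter(distances, fov_h, fov_v, fraction=0.1):
    """Step 7 sketch: keep only image-file pairs whose separation is
    below a proximity threshold computed as a percentage of the FOV
    dimensions, and count the surviving image files."""
    # Assumed combination of the two FOV dimensions; the source does
    # not specify how FOV_h and FOV_v are combined.
    threshold = fraction * (fov_h + fov_v) / 2.0
    kept_pairs = {p: d for p, d in distances.items() if d < threshold}
    # Counter of distinct stored image files, as in [0062].
    stored_files = {i for pair in kept_pairs for i in pair}
    return kept_pairs, len(stored_files)
```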
[0063] Step 8 is the step of filtering using a cluster count threshold. The cluster count threshold is a user-defined value that determines the minimum number of frames which must be stored during the previous step to determine if a cluster exists. The frame count from the previous step is compared to the cluster count threshold.
[0064] Step 9 is the optional step of verifying cluster identification. In this step, image files which surround identified clusters are sampled to include all image files relevant to the cluster. This step samples image files having position coordinates (x, y) external to the cluster perimeter. The sampling pattern is based on the calculated yaw angle (α).
[0065] Step 10 is the step of iteratively associating a cluster index value with each stored image file.

[0066] In various embodiments, Method 100 creates a folder for each cluster and stores cluster-associated image files in the corresponding cluster folder.

[0067] In various embodiments, Method 100 produces 2-D and 3-D geospatial products from FMV data.
[0068] FIG. 2 illustrates an exemplary embodiment of a computer system having hardware and software components and used for rapid processing of FMV still image frames 200. System 200 is a computer having a processor and memory. In the exemplary embodiment shown, the user input of an FMV file or stream is required. In various embodiments, System 200 also requires user input of the estimated scene elevation if the frame center elevation is not present within the metadata values.

[0069] In various embodiments, System 200 automatically processes and delivers rapid 2-D and 3-D geospatial products derived from currently fielded FMV platforms across the DoD.

[0070] In still other embodiments, System 200 may be used to process FMV data to create 2-D and 3-D geospatial data from U.S. DoD (MISB-compliant) and international defense organizations' airborne FMV sources. Other potential uses include underwater FMV bathymetric reconstruction and mobile video collection from ground vehicles.
[0071] Receiver 10 is a processing component for receiving FMV data from either a file or data stream.

[0072] Parser 20 is a processing component which performs an algorithm for parsing FMV data into image files and metadata files.

[0073] Extractor 30 is a processing component which extracts metadata values.

[0074] Focal length and angle processor 40 is a virtual processor which iteratively calculates the focal length (f) and photogrammetric yaw angle (α) and pitch angle (φ) for each image file.

[0075] Data matrix object 50 is a processing component for storing position coordinates associated with an image file and calculating the distance between position coordinates for image file pairs.

[0076] Proximity filtering processor 60 is a virtual processor for filtering image files by comparing the distance between pairs of image file position coordinates to the proximity threshold value.

[0077] Cluster count processor 70 is a processing component which compares the total image file count to the user-defined cluster count threshold value.

[0078] Verification processor 80 performs processes to verify inclusion of all relevant image files in a cluster by sampling image files external to the cluster, as determined by position coordinates (x, y) and the calculated yaw angle (α).

[0079] Indexing component 90 is a processing component which associates a cluster index value with each stored image file.
[0080] In various embodiments, System 200 rejects video frames with clouds obscuring the target FOV through a histogram analysis routine wherein the user sets a minimum color threshold that is compared to the RGB value of each image file.
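The color-threshold comparison can be sketched as below. This is a simplification assuming each frame is reduced to a list of (r, g, b) tuples and compared by mean value; the source describes a histogram analysis routine, so the reduction used here is an illustrative assumption.

```python
def passes_color_threshold(pixels, min_rgb_mean):
    """Sketch of [0080]'s color filter: compare a frame's mean RGB
    value to the user-set minimum color threshold. `pixels` is a
    list of (r, g, b) tuples standing in for the image histogram."""
    if not pixels:
        return False
    mean = sum(sum(p) for p in pixels) / (3.0 * len(pixels))
    return mean >= min_rgb_mean
```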
[0081] In other embodiments, System 200 exports cluster-associated image files and automatically generates a novel script for use in photogrammetric reconstruction software, to automatically execute the photogrammetric reconstruction using a command-line call. Geospatial products (2-D and 3-D) are automatically exported, including position coordinates of the sensor path, cluster-associated image files with metadata, and cluster center position coordinates. Upon completion, a notice is provided to the user with the location of the geospatial product files.
[0082] FIG. 3 illustrates an exemplary process of acquiring data which is subject to metadata loss. This metadata loss is corrected through interpolation processes.

[0083] In the exemplary embodiment shown, an FMV metadata file containing sensor positional data is missing metadata packets. During Step 2 of Method 100, metadata values are extracted from FMV data. Often metadata is not updated regularly and packets are missing. In this case, an interpolation function uses available metadata packets to calculate metadata values for missing metadata packets.

[0084] FIG. 3 illustrates sensor trajectory 120, field of view 140, available metadata packets 130a-130n, and lines of missing metadata packets.
[0085] FIG. 4A illustrates an exemplary collection of a sensed data set which is extracted and used by System 200 to calculate the focal length parameter. The exemplary embodiment shown includes extracted data used for calculating focal length 150.

[0086] FIG. 4A further illustrates sensor 160, focal length 150, field of view 140, horizontal field of view dimension (FOV_h) 180, and vertical field of view dimension (FOV_v) 170. In Step 4 of Method 100, focal length 150 is calculated based on the horizontal field of view dimension (FOV_h) 180 and vertical field of view dimension (FOV_v) 170.

[0087] FIG. 4B illustrates an exemplary process for calculating the focal length for each image file which is used by the system for calculating FOV position coordinates. The exemplary formula assumes that the sensor format is 4:3. The formula receives FOV horizontal and vertical dimension variables (FOV_h and FOV_v) as input and produces a focal length value (f) as output.
[0088] FIG. 5A illustrates an exemplary collection of a sensed data set used by the system to calculate photogrammetrically relevant angles. Specifically, this data is used for calculating yaw angle (α) 210 and pitch angle (φ) 190 for each image file. Each calculated yaw angle (α) 210 and pitch angle (φ) 190 is associated with an image file and required for calculating FOV position coordinates. Calculated yaw angle (α) 210 is also used for verifying cluster identification.

[0089] FIG. 5A further illustrates the position of sensor 160, field of view 140, distance between sensor and field of view (R_g) 220, altitude of sensor (h) 230, sensor-relative yaw angle (α) 210, and pitch angle (φ) 190. In the embodiment shown, sensor 160 is mounted to an aircraft. In the embodiment shown, the FMV metadata information is used to calculate yaw angle (α) 210 and pitch angle (φ) 190. In most embodiments, the roll angle is negligible.
[0090] FIG. 5B illustrates an exemplary process for calculating the horizontal angle of orientation of the sensor relative to the FOV. This angle is referred to as the yaw angle (α). Yaw angle (α) 210 is separately calculated for each image file. Yaw angle (α) 210 is used for calculating FOV position coordinates, and is further used for verifying cluster identification. In the exemplary embodiment shown, this calculation receives extracted metadata parameters, including the change in vertical position of sensor 160 (Δy) and the change in lateral position of sensor 160 (Δx), as input and produces a yaw angle (α) as output.
[0091] FIG. 5C illustrates an exemplary process for calculating the vertical angle of orientation of the sensor relative to the FOV. This angle is referred to as the pitch angle (φ) 190, which is used for calculating FOV position coordinates. The exemplary formula shown receives the variables distance between sensor and field of view (R_g) and altitude of sensor (h) as input and produces a pitch angle (φ) value as output.
[0092] FIG. 6A illustrates an exemplary process for calculating the distance between FOV position coordinates:

d = √((x₂ − x₁)² + (y₂ − y₁)²)
[0093] FIG. 6B illustrates an exemplary system output, which is a 2-D cluster map depicting the location of clusters System 200 has identified. The cluster map illustrates clusters 240a and 240b. The cluster map identifies FOV position coordinates of image files associated with each cluster. The clusters are associated with indexed individual image files.

[0094] In the exemplary embodiment shown, clusters are defined based on two system thresholds: the proximity threshold value and the cluster count threshold.

[0095] The proximity threshold value is the maximum allowed distance between image file coordinates. It is determined as a percentage of the extracted metadata values representing the horizontal field of view dimension (FOV_h) and vertical field of view dimension (FOV_v).

[0096] The cluster count threshold refers to a minimum number of image files which meet the proximity threshold for a cluster to be recognized.

[0097] An agglomerative clustering algorithm is used to decide which image files to use for each 3-D reconstruction. The clustering algorithm is based on calculated FOV position coordinates of image files.
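The two-threshold clustering described above can be sketched as follows. The patent does not specify the linkage rule of its agglomerative algorithm; the single-linkage union-find grouping below is an assumption used to make the sketch concrete.

```python
import math

def cluster_frames(coords, proximity_threshold, cluster_count_threshold):
    """Sketch of [0094]-[0097]: group image files whose FOV position
    coordinates lie within the proximity threshold (single-linkage
    agglomeration via union-find), then keep only groups meeting the
    cluster count threshold."""
    parent = list(range(len(coords)))

    def find(i):
        # Find the group root with path compression.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    # Merge every pair of frames closer than the proximity threshold.
    for i in range(len(coords)):
        for j in range(i + 1, len(coords)):
            if math.hypot(coords[j][0] - coords[i][0],
                          coords[j][1] - coords[i][1]) < proximity_threshold:
                parent[find(i)] = find(j)

    groups = {}
    for i in range(len(coords)):
        groups.setdefault(find(i), []).append(i)
    # Apply the cluster count threshold to each candidate group.
    return [g for g in groups.values() if len(g) >= cluster_count_threshold]
```

With three nearby frames, one distant frame, a threshold of 2, and a count threshold of 3, this yields a single recognized cluster of the three nearby frames.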
[0098] FIG. 7 illustrates an exemplary process of sampling image frames external to the cluster to verify that the set of image files associated with a cluster is correctly indexed.

[0099] FIG. 7 further illustrates cluster 240, user-defined distance 260, FOV position coordinates of qualifying image files 270a-h, and cluster center 280. System 200 automatically samples image files having FOV position coordinates located within user-defined distance 260 of cluster center 280 in a pattern based on the calculated yaw angle 210.
What is claimed is:

1. A method to transform Full Motion Video (FMV) to photogrammetric data which is implemented on a computer system having a memory and a processor and comprised of the steps of:

receiving FMV data;

parsing said FMV data to create image files and metadata files;

extracting metadata values from said metadata files, including sensor position coordinate values and field of view dimensions;

calculating focal length values for each of said image files;

calculating angles for each of said image files;

calculating position value coordinates from said angles;

calculating the distance (d) for image file pairs from said position value coordinates;

creating filtered image files by comparing distance (d) to a proximity value threshold; and

comparing the number of filtered image files to a cluster count threshold to identify clusters.
2. The method of claim 1, wherein said sensor position coordinate values and field of view dimensions include values selected from a group consisting of latitude, longitude, and altitude of the sensor and vertical FOV dimension FOVV and horizontal FOV dimension FOVH.

3. The method of claim 1, wherein said metadata files are Key-Length-Value (KLV) files.

4. The method of claim 1, which further includes the step of performing an interpolation algorithm.

5. The method of claim 4, wherein said interpolation algorithm is determined by computing averages of sensor position coordinates over time and populates said metadata values.

6. The method of claim 1, which further includes the step of calculating said proximity threshold by calculating a percentage of the FOV horizontal and vertical dimension variables (FOVH and FOVV).

7. The method of claim 1, wherein the step of calculating distance (d) further includes the step of iteratively populating a position value matrix data object having processing capability to calculate distance (d) and compare distance (d) to user-defined thresholds.

8. The method of claim 7, which further includes the step of iteratively comparing distance (d) to said proximity threshold value.

9. The method of claim 1, which further includes the step of incrementing a stored image file count.

10. The method of claim 1, which further includes the step of comparing said stored image file count to said cluster count threshold to determine if a cluster exists.

11. The method of claim 1, which further includes associating image files with a value unique to a single cluster.

12. The method of claim 1, wherein the step of calculating angles includes extracting values selected from a group consisting of one or more sensor altitude change values (Δy), one or more lateral position change values of said sensor (Δx), one or more values representing distance between said sensor and said field of view (R_g), and one or more values representing altitude of said sensor (h).

13. The method of claim 1, wherein the step of calculating angles includes calculating one or more angles selected from a group consisting of a yaw angle (α) and a pitch angle (φ).

14. The method of claim 1, wherein said FMV data is imported from a group of FMV sources consisting of a file and a live stream.

15. The method of claim 14, wherein said live stream is selected from a group consisting of a continuous live stream, a non-continuous live stream, and an intermittent live stream.

16. The method of claim 1, wherein said method further includes the step of filtering said image files having an RGB value below a user-defined color threshold value.

17. The method of claim 1, which further includes the step of creating a geographic data set corresponding to said FMV data.

18. The method of claim 1, which further includes the step of creating a photogrammetric data file which may be processed by photogrammetric analysis software.
19. A computer system having memory and processing components including:

a receiver component for receiving FMV data;

a parsing component for receiving FMV data and parsing said FMV data input to produce image files and metadata files as output;

an extraction component for receiving said metadata files as input and extracting and storing said metadata values;

a focal length processing component which receives FOV horizontal and vertical dimension variables (FOV_h and FOV_v) as input and produces a focal length value (f) as output;

an angle processing component which receives sensor values as input and produces yaw angle (α) values and pitch angle (φ) values as output;

a data matrix object which includes at least one data structure for storing position coordinate values and a virtual processing component for calculating distance (d) between position coordinate values for image file pairs;

a first virtual processor which iteratively receives values for distance (d) as input for each image frame and compares said distance (d) to a proximity value threshold to filter said image files for further processing;

a cluster verification virtual processing component which iteratively compares image file position coordinates to cluster center coordinates of the center of said cluster and said yaw angle (α); and

an indexing component for associating a cluster index value with each image file to identify the cluster with which the image is associated.

20. The system of claim 19, which further includes the step of calculating sensor position values selected from a group of variables consisting of one or more sensor altitude change values (Δy), one or more lateral position change values of said sensor (Δx), one or more values representing distance between said sensor and said field of view (R_g), and one or more values representing altitude of said sensor (h).