
Optical designs

for fundus imaging


From traditional desktop
to novel optical design in small
form factors

Petteri Teikari, PhD


Singapore Eye Research Institute (SERI)
Visual Neurosciences group

http://petteri-teikari.com/
Version “Wed 10 October 2018”
“Traditional” Fundus Imaging Optical Designs
Intro to Fundus Optics Design

2009
Fundus camera systems: a comparative analysis
Edward DeHoog and James Schwiegerling
Applied Optics Vol. 48, Issue 2, pp. 221-228 (2009)
https://doi.org/10.1364/AO.48.000221

Two patent design forms: external-illumination design and internal-illumination design
Retinal photography requires the use of a complex optical system, called a fundus camera, capable of illuminating and imaging the retina simultaneously. The patent literature shows two design forms but does not provide the specifics necessary for a thorough analysis of the designs to be performed. We have constructed our own designs based on the patent literature in optical design software and compared them for illumination efficiency, image quality, ability to accommodate patient refractive error, and manufacturing tolerances, a comparison lacking in the existing literature.

[Figure: Results of 100 Monte Carlo trials of fundus camera systems]

Tolerance analysis must always be considered when determining which system is able to perform a specific task better. Systems with high performance metrics but extremely tight or impossible tolerances are likely to be passed over for production or redesigned to make manufacturing easier.
Kidger, Intermediate Optical Design (SPIE, 2004)
Shannon, The Art and Science of Optical Design (1997)
CVI Melles Griot, “Optical fabrication tolerances”
Rochester Precision Optics, “Traditional optics capability”.

“Retinal imaging presents a unique difficulty considering that the retina must be illuminated and imaged simultaneously, a process which forces illumination and imaging systems to share a common optical path. Because the retina is a minimally reflective surface, the power of the back reflections from the shared optics of the illumination and imaging paths is greater than the power reflected by the retina.”
Traditional fundus camera design

Optical diagram of a traditional digital fundus camera with its three basic modules: objective, illumination and camera.
De Matos et al. (2015), http://doi.org/10.1117/12.2190515
De Oliveira et al. (2016), http://doi.org/10.1117/12.2236973
Fundus Cameras Commercial Landscape #1
Vishwanath Manik Rathod
http://raiith.iith.ac.in/4141/1/Thesis_Mtech_EE_4141.pdf
Fundus Cameras Commercial Landscape #2
Vishwanath Manik Rathod
http://raiith.iith.ac.in/4141/1/Thesis_Mtech_EE_4141.pdf

Next Sight Nexy Tutorial - fundus camera SIR Oftalmica
https://youtu.be/JxmFyhFRN3g
Nexy Robotic Retinal Imaging System Receives FDA ... - Eyewire News

Global Fundus Cameras Market to be worth USD 620 Million By 2024 - Zion Market Research
https://globenewswire.com/news-release/2018/08/19/1553691/0/en/Global-Fundus-Cameras-Market-to-be-worth-USD-620-Million-By-2024-Zion-Market-Research.html

Fundus Cameras Market: by Product Type (Mydriatic Fundus Cameras [Tabletop and Handheld], Non-mydriatic Fundus Cameras [Tabletop and Handheld], Hybrid Fundus Cameras, and ROP Fundus Cameras) and by End User (Hospitals, Ophthalmology Clinics, and Others): Global Industry Perspective, Comprehensive Analysis and Forecast, 2018 - 2024
Pen-like fundus camera design

December 2009
US8836778B2 Portable fundus camera
https://patents.google.com/patent/US8836778B2/en
Filipp V. Ignatovich, David M. Kleinman, Christopher T. Cotton, Todd Blalock; Lumetrics Inc

Legalese description: “Camera for imaging the fundus of an eye, the camera comprising optics aligned along an imaging axis intersecting a point on the fundus and configured to focus light reflected back from the fundus onto an image receptor, wherein the optics are capable of varying a field of view of the camera along a path circumferentially around the point on the fundus, whereby the image receptor acquires images of portions of the fundus located at different peripheral locations around the point of the fundus”
Spectral characterization of typical fundus camera

September 2010
Spectral characterization of an ophthalmic fundus camera
https://doi.org/10.1117/12.844855
Clayton T. Miller; Carl J. Bassi; Dale Brodsky; Timothy Holmes

This work describes the characterization of one system, the Topcon TRC-50F, necessary for converting this camera from film photography to spectral imaging with a CCD. This conversion consists of replacing the camera's original xenon flash tube with a monochromatic light source and the film back with a CCD. A critical preliminary step of this modification is determining the spectral throughput of the system, from source to sensor, and ensuring there are sufficient photons at the sensor for imaging.
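The “sufficient photons at the sensor” check is a simple radiometric estimate: divide the energy delivered during the exposure by the energy of one photon. A hedged sketch with illustrative values (not the paper's numbers):

```python
# Back-of-envelope photon budget of the kind this characterization step
# requires. All numeric inputs below are illustrative assumptions.

H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s

def photons_per_pixel(power_w, wavelength_m, exposure_s, n_pixels):
    """Photons per pixel for optical power spread uniformly over n_pixels."""
    e_photon = H * C / wavelength_m              # energy of one photon, J
    return power_w * exposure_s / e_photon / n_pixels

# e.g. 1 nW of 550 nm light reaching the sensor, 10 ms exposure, 1 Mpixel:
n = photons_per_pixel(1e-9, 550e-9, 10e-3, 1e6)
print(f"{n:.0f} photons/pixel")   # a few tens of photons: shot-noise limited
```

A count this low would be dominated by shot noise (SNR ≈ √N), which is exactly why the source-to-sensor throughput must be measured before committing to a monochromatic source.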
Dynamic artifacts: cardiac gating for fundus video

2003
Time Course of Fundus Reflection Changes According to the Cardiac Cycle
https://iovs.arvojournals.org/article.aspx?articleid=2413124
R.P. Tornow; O. Kopp; B. Schultheiss

To compare the time course of fundus reflection from video sequences (25 frames/sec) at different retinal locations with cardiac parameters.

The pulsatile reflection component ΔR(t)/R(t) changes corresponding to the cardiac cycle. ΔR(t)/R(t) rises suddenly during systole, reaches its maximum after about 32% of the pulse duration time (RR-interval) and decreases towards the end of the diastole. The pulse shape of ΔR(t)/R(t) shows a high correspondence to the cardiac impedance signal while it is different from the pulse shapes of the peripheral impedance signals.

The reflection of the ocular fundus depends on the cardiac cycle. The simultaneous assessment of ΔR(t)/R(t) and the impedance signals allows correlating parameters of ocular microcirculation with cardiac parameters and distinguishing physiologically induced reflection changes from artifacts. More than this, the pulsatile reflection amplitude has to be taken into consideration for quantitative imaging like retinal densitometry.

November 2016
Retinal venous pulsation: Expanding our understanding and use of this enigmatic phenomenon
https://doi.org/10.1016/j.preteyeres.2016.06.003
William H. Morgan, Martin L. Hazelton, Dao-Yi Yu

Recently, improved ophthalmodynamometry and video recording techniques have allowed us to explore the fundamentals of retinal vein pulsation. This demonstrates that retinal venous collapse is in phase with both IOP and CSFP diastole, indicating the dependence upon CSFP pulse. We describe in some detail the mathematical and physical models of Starling resistors and how their results can be applied to understand the physiology of retinal vein pulsation.

October 2017
Automatic Detection of Spontaneous Venous Pulsations Using Retinal Image Sequences
https://doi.org/10.1007/978-3-319-68195-5_90
Michal Hracho, Radim Kolar, Jan Odstrcilik, Ivana Liberdova, Ralf P. Tornow

Evaluation of magnitude of spontaneous venous pulsation has been proven to correlate with occurrence of glaucoma. Based on this relation a method is proposed that might help to detect glaucoma via detection of spontaneous venous pulsation.
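As a sketch of the quantity involved (my construction, not the authors' code), ΔR(t)/R(t) can be computed from a fundus video as the per-frame mean intensity of a retinal region, normalised by its time average:

```python
# Minimal sketch of the pulsatile reflection component ΔR(t)/R(t):
# subtract the time-averaged reflection from each frame's regional mean
# intensity and divide by that average. Input values are a toy series.

def pulsatile_reflection(frame_means):
    """ΔR(t)/R(t): relative deviation of fundus reflection from its time mean."""
    r_mean = sum(frame_means) / len(frame_means)
    return [(r - r_mean) / r_mean for r in frame_means]

# Toy 25 fps example: a weak brightness oscillation over one cardiac cycle
frames = [100.0, 101.5, 102.0, 101.0, 100.0, 99.5, 99.0, 99.0, 100.0]
dr = pulsatile_reflection(frames)
print(max(dr) - min(dr))   # peak-to-peak pulsatile amplitude (~3% here)
```

In practice the regional means would come from registered video frames, and the peak of ΔR(t)/R(t) would be timed against the RR interval as the paper describes.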
Modify existing fundus camera for custom applications

2010
High-resolution hyperspectral imaging of the retina with a modified fundus camera
Nourrit V, Denniss J, Muqit MM, Schiessl I, Fenerty C, Stanga PE, Henson DB.
http://doi.org/10.1016/j.jfo.2010.10.010

This paper gives information on how to convert a standard fundus camera into a hyperspectral camera with off-the-shelf elements (CCD camera, liquid crystal filter, optical fibre and slide lamp projector).

Technically, its main limitation is the low transmission of the filter (20% max for unpolarized light below 650 nm), which limits imaging below 460 nm.
Remidio: smartphone-based commercialized fundus camera

June 2011
US9398851B2 Retinal imaging device
https://patents.google.com/patent/US9398851B2/en
https://patents.google.com/patent/WO2018047198A3/en
Sivaraman Anand, Kummaya Pramod, Nagarajan Shanmuganathan
Remidio Innovative Solutions Pvt Ltd

https://doi.org/10.1038/s41433-018-0064-9
http://remidio.com/
MobileVision

2012
mobileVision: A Portable, Scalable Retinal Imaging System
TI Engibous Competition Report, Rice University
http://www.ti.com/corp/docs/university/docs/Rice_University_mobileVision%20Final%20Report.pdf

Remidio US9398851B2 Retinal imaging device


https://patents.google.com/patent/US9398851B2/en
https://patents.google.com/patent/WO2018047198A3/en
Sivaraman Anand, Kummaya Pramod,
Smartphone-based Nagarajan Shanmuganathan
remidio innovative solutions pvt ltd

commercialized
fundus camera
https://doi.org/10.1038/s41433-018-0064-9
http://remidio.com/
Fundus Self-Imaging: “Eye Selfie” from MIT

February 2012
US9295388B2 Methods and apparatus for retinal imaging
https://patents.google.com/patent/US9295388B2/en
Matthew Everett Lawson, Ramesh Raskar
Massachusetts Institute of Technology

This invention comprises apparatus for retinal self-imaging. Visual stimuli help the user self-align his eye with a camera. Bi-ocular coupling induces the test eye to rotate into different positions. As the test eye rotates, a video is captured of different areas of the retina. Computational photography methods process this video into a mosaiced image of a large area of the retina. An LED is pressed against the skin near the eye, to provide indirect, diffuse illumination of the retina. The camera has a wide field of view, and can image part of the retina even when the eye is off-axis (when the eye's pupillary axis and camera's optical axis are not aligned). Alternately, the retina is illuminated directly through the pupil, and different parts of a large lens are used to image different parts of the retina. Alternately, a plenoptic camera is used for retinal imaging.

Computational photography techniques are used to process the multiple images and to produce a mosaiced image. These techniques include (i) “lucky” imaging, in which high-pass filtering is used to identify images that have the highest quality, and to discard poorer quality images.
Eye-Selfie

Self-aligned, mobile, non-mydriatic fundus photography. The user is presented with an alignment-dependent fixation cue on a ray-based display. Once correctly aligned, a self-acquired retinal image is captured. This retinal image can be used for health, security or HMD calibration. Illustration: Laura Piraino

http://web.media.mit.edu/~tswedish/projects/eyeSelfie.html
https://youtu.be/HuXgrbwOjvM
T. Swedish, K. Roesch, I.K. Lee, K. Rastogi, S. Bernstein, R. Raskar. eyeSelfie: Self Directed Eye Alignment using Reciprocal Eye Box Imaging. Proc. of SIGGRAPH 2015 (ACM Transactions on Graphics 34, 4), 2015.

Expert-free eye alignment and machine learning for predictive health
Tristan Breaden Swedish
https://dspace.mit.edu/handle/1721.1/112543

I will present a system that includes a novel method for eye self-alignment and automatic image analysis and evaluate its effectiveness when applied to a case study of a diabetic retinopathy screening program. This work is inspired by advances in machine learning that make accessible interactions previously confined to specialized environments and trained users. I will also suggest some new directions for future work based on this expert-free paradigm.

https://www.economist.com/science-and-technology/2015/06/13/retina-selfie
Fundus Self-Alignment

http://web.media.mit.edu/~tswedish/projects/eyeSelfie.html

“The subject aligns his/her own for optimal image”
Check the patent trail

Cited By (14)
US20160302665A1 * 2015-04-17 / 2016-10-20 Massachusetts Institute Of Technology: Methods and Apparatus for Visual Cues for Eye Alignment
US20170000454A1 * 2015-03-16 / 2017-01-05 Magic Leap, Inc.: Methods and systems for diagnosing eyes using ultrasound
WO2009081498A1 * 2007-12-26 / 2009-07-02 Shimadzu Corporation: Organism image capturing device
EP2583619A1 * 2011-10-22 / 2013-04-24 SensoMotoric Instruments GmbH: Apparatus for monitoring one or more surgical parameters of the eye
US20150021228A1 2012-02-02 / 2015-01-22 Visunex Medical Systems Co., Ltd.: Eye imaging apparatus and systems
US9655517B2 2012-02-02 / 2017-05-23 Visunex Medical Systems Co. Ltd.: Portable eye imaging apparatus
US9351639B2 2012-03-17 / 2016-05-31 Visunex Medical Systems Co. Ltd.: Eye imaging apparatus with a wide field of view and related methods
US9237847B2 2014-02-11 / 2016-01-19 Welch Allyn, Inc.: Ophthalmoscope device
US9211064B2 2014-02-11 / 2015-12-15 Welch Allyn, Inc.: Fundus imaging system
US9986908B2 2014-06-23 / 2018-06-05 Visunex Medical Systems Co. Ltd.: Mechanical features of an eye imaging apparatus
US9675246B2 2014-09-15 / 2017-06-13 Welch Allyn, Inc.: Borescopic optical system for medical diagnostic instruments and medical diagnostic instruments having interlocking assembly features
EP3026884A1 * 2014-11-27 / 2016-06-01 Thomson Licensing: Plenoptic camera comprising a light emitting device
US9848773B2 2015-01-26 / 2017-12-26 Visunex Medical Systems Co. Ltd.: Disposable cap for an eye imaging apparatus and related methods
US20160320837A1 * 2015-05-01 / 2016-11-03 Massachusetts Institute Of Technology: Methods and Apparatus for Retinal Retroreflection Imaging
Design for minimizing straylight

November 2014
Design, simulation and experimental analysis of an anti-stray-light illumination system of fundus camera
https://doi.org/10.1117/12.2073619
Chen Ma; Dewen Cheng; Chen Xu; Yongtian Wang

e.g. use a polarized light source

A fiber-coupled, ring-shaped light source that forms an annular beam is used to make full use of light energy. The parameters of the light source, namely its divergence angle and the size of the exit surface of the fiber rod coupler, are determined via simulation in LightTools. Simulation results show that the illumination uniformity of the fundus can reach 90% when a 3~6 mm annular spot on the cornea is illuminated. It is also shown that a smaller divergence angle (i.e., 30°) benefits both the uniformity of irradiance of the fundus image on the focus plane (i.e., CCD) and the sharpness of the image profile. To weaken the stray light, a polarized light source is used, and an analyzer plate whose light vector is perpendicular to the source is placed after the beam splitter in the imaging system. Simulation shows the average relative irradiance of stray light after stray-light elimination drops to 1%.
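The crossed polarizer/analyzer trick works because specular ghost reflections preserve the source polarisation while light scattered by the fundus is largely depolarised. A minimal sketch with Malus's law (illustrative, not the paper's LightTools simulation; the 0.5 depolarised-pass fraction is an idealisation):

```python
import math

# Malus's law: fraction of linearly polarised light transmitted by an
# analyzer at angle theta to the polarisation axis. Specular ghosts keep
# the source polarisation, so a crossed (90 deg) analyzer blocks them;
# fundus-scattered light is assumed fully depolarised here (idealised),
# so an ideal analyzer passes half of it regardless of orientation.

def malus(theta_deg):
    """Transmitted fraction of linearly polarised light through an analyzer."""
    return math.cos(math.radians(theta_deg)) ** 2

ghost_pass = malus(90)    # polarised ghost vs crossed analyzer -> essentially 0
signal_pass = 0.5         # idealised pass fraction for depolarised fundus light
print(ghost_pass, signal_pass)
```

Real retinal tissue only partially depolarises the light, so the achievable ghost suppression is finite, consistent with the paper's ~1% residual stray light.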
PEEK Retina: smartphone-based fundus imaging
Panretinal 2.2: 3D printed optics holders

2014
3D Printed Smartphone Indirect Lens Adapter for Rapid, High Quality Retinal Imaging
http://dx.doi.org/10.7309/jmtm.3.1.3
David Myung, Alexandre Jais, Lingmin He, Mark S. Blumenkranz, Robert T. Chang
Byers Eye Institute at Stanford, Stanford University School of Medicine
3D Ophthalmoscope Design

2015
Study of optical design of three-dimensional digital ophthalmoscopes
https://doi.org/10.1364/AO.54.00E224
Yi-Chin Fang, Chih-Ta Yen, and Chin-Hsien Chu

A 3D optical-zoom sensory system of the human eye with infrared and visible light is proposed (Code V, LightTools) to help the doctor diagnose human eye diseases. The proposed lens design for 3D digital ophthalmoscopes provides a good means of simultaneously accessing infrared and visible light band information to help doctors perform diagnostics.

The MTF plots remain almost above 0.7 of the diffraction limit at spatial frequencies up to 40 cycles/mm under all zoom conditions in the IR region. According to the experiment results, the proposed 3D digital ophthalmoscope is suitable for future ophthalmoscope design.

[Figure: LightTools diagram]
D-Eye: 3D printed optics holders

2015
A Novel Device to Exploit the Smartphone Camera for Fundus Photography
http://dx.doi.org/10.1155/2015/823139
Andrea Russo, Francesco Morescalchi, Ciro Costagliola, Luisa Delcassi, and Francesco Semeraro

Exploded view of the D-Eye module (angles and distances between components are approximated). Retinal images are acquired using coaxial illumination and imaging paths thanks to a beam splitter (C). The blue arrow depicts the path of the light; the red arrow depicts the path of fundus imaging. Device components are: glass platelet (A) with imprinted negative lens (A′), photo-absorbing wall (B), beam splitter (C), mirror (D), plastic case (E), diaphragm (F), polarized filters (G, H), flash and camera glass (J, I), and magnetic external ring (K).
Tunable Liquid Lens with transpupillary illumination to simplify the optical design

2015
Accessible Digital Ophthalmoscopy Based on Liquid-Lens Technology
https://doi.org/10.1007/978-3-319-24571-3_68
Christos Bergeles, Pierre Berthet-Rayne, Philip McCormac, Luis C. Garcia-Peraza-Herrera, Kosy Onyenso, Fan Cao, Khushi Vyas, Melissa Berthelot, Guang-Zhong Yang

This paper demonstrates a new design integrating modern components for ophthalmoscopy. Simulations show that the optical elements can be reduced to just two lenses: an aspheric ophthalmoscopic lens and a commodity liquid-lens, leading to a compact prototype.

Circularly polarised transpupillary illumination, with limited use so far for ophthalmoscopy, suppresses reflections, while autofocusing preserves image sharpness. Experiments with a human-eye model and cadaver porcine eyes demonstrate our prototype's clinical value and its potential for accessible imaging when cost is a limiting factor.
Simplifying Optical Design

September 2016
Evaluation of retinal illumination in coaxial fundus camera
https://doi.org/10.1117/12.2236973
André O. de Oliveira; Luciana de Matos; Jarbas C. Castro Neto

“Substitute the complex illumination system by a ring of LEDs mounted coaxially to the imaging optical system, positioning it in the place of the holed mirror of the traditional optical design.”

We evaluated the impact of this substitution regarding image quality (measured through the modulation transfer function) and illumination uniformity produced by this system on the retina. The results showed there is no change in image quality and no problem was detected concerning uniformity compared to the traditional equipment. Consequently, we avoided off-axis components, easing the alignment of the equipment without reducing either image quality or illumination uniformity.

Photograph (left) and optical drawing (center) of the OEMI-7 Ocular Imaging Eye Model (Ocular Instruments Inc.). Picture of the OEMI-7 Ocular Imaging Eye Model (right) obtained using the innovative equipment. No obscuration is observed and the image is free of reflexes.
Optimizing Zemax tools for efficient modelling of fundus cameras

November 2016
Minimising back reflections from the common path objective in a fundus camera
https://doi.org/10.1117/12.2256633
A. Swat, Solaris Optics S.A.

Eliminating back reflections is critical in the design of a fundus camera with an internal illuminating system. As there is very little light reflected from the retina, even excellent antireflective coatings are not sufficient suppression of ghost reflections; therefore the number of surfaces in the common optics in illuminating and imaging paths shall be minimised. Typically a single aspheric objective is used. In the paper an alternative approach, an objective with all spherical surfaces, is presented. As more surfaces are required, a more sophisticated method is needed to get rid of back reflections. Typically, back reflection analysis comprises treating subsequent objective surfaces as mirrors, and reflections from the objective surfaces are traced back through the imaging path.

There are also standard ghost control merit function operands available in the sequential ray-trace, for example in the Zemax system, but these don't allow a back ray-trace in an alternative optical path, illumination vs. imaging. What is proposed in the paper is a complete method to incorporate ghost reflected energy into the ray-tracing system merit function for sequential mode, which is more efficient in the optimisation process. Although developed for the specific case of a fundus camera, the method might be utilised in a wider range of applications where ghost control is critical.

Commonly used amongst optical specialists is Zemax software; it does allow calling a macro script from the system merit function by the ZPLM operand. The optimisation system therefore comprises the following:
● The imaging system built in Zemax
● A typical merit function constructed to allow for the imaging system optimisation
● An additional line in the merit function to call a macro, operand ZPLM
● The macro called by the ZPLM operand shall open a new file, where a second system is defined, including as many configurations as the number of surfaces in the common path suspected to generate parasite back reflections. The macro evaluates the detector flux for each configuration and sums it up, the macro is closed, and the total flux value is returned to the imaging system merit function. The returned flux becomes one of the merit function components being minimised among other imaging system properties; the weight is individually adjusted by the designer to balance system properties.
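The core optimisation idea reduces to adding one weighted term to the merit function. A schematic sketch (my construction in Python, not Zemax ZPL; the function names and the quadratic form of the terms are assumptions for illustration):

```python
# Schematic of combining ghost-reflected energy with ordinary imaging
# criteria in a single merit value, as the ZPLM-macro approach does:
# the macro's summed detector flux enters as one more weighted term,
# so ghost suppression is traded off against imaging performance.

def merit(imaging_terms, ghost_flux, ghost_weight):
    """Combined merit: weighted squared imaging residuals + ghost penalty.

    imaging_terms: list of (weight, residual) pairs, residual = current - target
    ghost_flux: total back-reflected flux summed over all ghost configurations
    ghost_weight: designer-chosen weight balancing ghosts vs. image quality
    """
    m = sum(w * r * r for w, r in imaging_terms)
    return m + ghost_weight * ghost_flux ** 2

# Toy example: two imaging residuals plus a ghost-flux penalty
print(merit([(1.0, 0.02), (0.5, 0.1)], ghost_flux=0.03, ghost_weight=10.0))
```

The optimiser then minimises this scalar; raising `ghost_weight` pushes the design toward surface shapes whose ghosts defocus away from the detector, at some cost in the ordinary imaging terms.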
Startups focusing on the software stack

Nov. 2016
Phelcom [University of São Paulo (USP)], Smart Retinal Camera (SRC), a retinal scanner controlled by an attached smartphone
http://revistapesquisa.fapesp.br/en/2017/05/17/eye-on-the-smartphone/

The SRC is designed to perform three kinds of fundus exams: color, red-free, and fluorescein angiography (FA). According to Paulo Schor, professor in the Department of Ophthalmology of the Federal University of São Paulo (Unifesp), devices that rely on smartphones to perform eye exams do not belong to the future but to the present. “They're accessible – that is, easy to operate and cheap.”
Miniaturized nonmydriatic fundus camera design

March 2017
Optical design of portable nonmydriatic fundus camera
https://doi.org/10.1117/12.2268699
Weilin Chen; Jun Chang; Fengxian Lv; Yifan He; Xin Liu; Dajiang Wang

The ocular fundus is not luminous itself, and the reflectivity of the retina to visible light is about 0.1% to 10%. If the light-blocking effect of the pupil is considered, the reflectivity of the fundus is about 0.1% to 1%. Active illumination is therefore needed for the total low reflectivity. The fundus camera uses two kinds of LED as light sources: one is a 590 nm LED and the other is an 808 nm LED. The pulsed 590 nm LED is used to illuminate the capillary vessels in the ocular fundus and take pictures for the high contrast of the fundus images.

To evaluate the performance of the lighting system, the optimization results from Zemax were imported into LightTools, and the human eye model was also added to perform a non-sequential ray tracing. The illumination distribution in the fundus is shown. (Sensor: Sony ICX282AQ CCD. Schematic of annular illumination.)
Smartphone fundus imaging with design choices laid out

2017
Optical Design of a Retinal Image Acquisition Device for Mobile Diabetic Retinopathy Assessment
https://doi.org/10.1007/978-3-319-24571-3_68
David Simões de Melo, Bachelor Degree in Engineering Sciences, NOVA University of Lisbon
Smartphone fundus imaging with design choices laid out

2017
A Portable, Inexpensive, Nonmydriatic Fundus Camera Based on the Raspberry Pi® Computer
https://doi.org/10.1155/2017/4526243
Bailey Y. Shen and Shizuo Mukai
Department of Ophthalmology, Illinois Eye and Ear Infirmary, University of Illinois at Chicago; Retina Service, Massachusetts Eye and Ear Infirmary, Harvard Medical School

We built a point-and-shoot prototype camera using a Raspberry Pi computer, an infrared-sensitive camera board, a dual infrared and white light light-emitting diode, a battery, a 5-inch touchscreen liquid crystal display, and a disposable 20-diopter condensing lens. Our prototype camera was based on indirect ophthalmoscopy with both infrared and white lights. Results: The prototype camera weighed 386 grams. The total cost of the components, including the disposable lens, was $185.20.

Our prototype, or a camera based on our prototype, may have myriad uses for health professionals. For example, it may be useful for ophthalmologists seeing inpatient consults, as moving inpatients to a stationary fundus camera can be impractical, and many neurosurgery inpatients in the intensive care unit are not allowed to be pharmacologically dilated. The comfort of nonmydriatic imaging may make the camera useful for pediatric ophthalmologists, although alignment might be difficult. Finally, the low cost and small size of our camera may make the camera a valuable tool for ophthalmologists practicing global medicine. With added features such as a large memory card and a strong wireless card or cell phone antenna, the device could help providers practice telemedicine.
Open-source optics design blocks accelerating basic design

2018
μCube: A Framework for 3D Printable Optomechanics
http://doi.org/10.5334/joh.8 | https://mdelmans.github.io/uCube
Mihails Delmans, Jim Haseloff (2018), University of Cambridge
Journal of Open Hardware, 2(1), p.2

For attaching a commercial photo camera lens, a µTMountFace is used, which features a T-Mount adapter ring, obtained from a commercial T-Mount adapter. In the M12 CCTV camera lens version, both the lens and the Raspberry Pi Camera are held together by a single part. CAD design in OpenSCAD.
Modeling the pupil/iris as the imaging aperture

May 18, 2018
The entrance pupil of the human eye
https://doi.org/10.1101/325548
Geoffrey Karl Aguirre

The precise appearance of the entrance pupil is the consequence of the anatomical and optical properties of the eye, and the relative positions of the eye and the observer. This paper presents a ray-traced (Matlab), exact model eye that provides the parameters of the entrance pupil ellipse for an observer at an arbitrary location and for an eye that has undergone biologically accurate rotation.

Calculation of the virtual image location of a pupil boundary point. This 2D schematic shows the cornea and a 2 mm radius pupil aperture of the model eye. A camera is positioned at a 45° viewing angle relative to the optical axis of the eye. The optical system is composed of the aqueous humor, the back and front surfaces of the cornea, and the air. We consider the set of rays that might originate from the edge of the pupil. Each of these rays departs from the pupil aperture at some angle with respect to the optical axis of the eye. We can trace these rays through the optical system.
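The single operation underlying such a ray trace is refraction at an interface. A minimal flavour (illustrative, not Aguirre's Matlab model, which traces both corneal surfaces in full): Snell's law applied to a ray leaving the pupil edge through an aqueous-to-air interface, which is why the entrance pupil seen by the camera is a magnified virtual image of the real pupil.

```python
import math

# Snell's law at one refracting interface: rays from the pupil edge
# bend toward steeper angles when leaving the higher-index aqueous
# humor, so the camera sees a magnified virtual pupil. This collapses
# the cornea to a single surface purely for illustration.

def snell(theta_i_deg, n1, n2):
    """Refraction angle (degrees) for a ray crossing an n1 -> n2 interface."""
    s = n1 * math.sin(math.radians(theta_i_deg)) / n2
    return math.degrees(math.asin(s))

# Ray in aqueous humor (n ~ 1.336) meeting air at 10 degrees incidence
# exits at a noticeably steeper angle:
print(snell(10.0, 1.336, 1.000))
```

Repeating this at the back and front corneal surfaces for a fan of rays, then intersecting the extrapolated exit rays, yields the virtual image location of the pupil boundary point that the schematic describes.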
“Wide-Field” Fundus Imaging Optical Designs
Smartphone-based wide-field fundus imaging

July 2018
A smartphone-based tool for rapid, portable, and automated wide-field retinal imaging
https://doi.org/10.1101/364265
Tyson Kim, Frank Myers, Clay Reber, PJ Loury, Panagiota Loumou, Doug Webster, Chris Echanique, Patrick Li, Jose Davila, Robi Maamari, Neil Switz, Jeremy Keenan, Maria Woodward, Yannis Paulus, Todd Margolis, Daniel Fletcher
Department of Ophthalmology and Visual Sciences, University of Michigan School of Medicine; Bioengineering and Biophysics Program, University of California, Berkeley; Department of Ophthalmology, University of California, San Francisco; Department of Ophthalmology and Visual Sciences, Washington University School of Medicine in St. Louis; Department of Physics and Astronomy, San José State University; Chan-Zuckerberg Biohub, San Francisco, CA

High-quality, wide-field retinal imaging is a valuable method to screen preventable, vision-threatening diseases of the retina. Smartphone-based retinal cameras hold promise for increasing access to retinal imaging, but variable image quality and restricted field of view can limit their utility. We developed and clinically tested a smartphone-based system that addresses these challenges with automation-assisted imaging.

The CellScope Retina system was designed to improve smartphone retinal imaging by combining automated fixation guidance, photomontage, and multi-colored illumination with optimized optics, user-tested ergonomics, and a touch-screen interface. System performance was evaluated from images of ophthalmic patients taken by non-ophthalmic personnel.

The fixation target is translated through a series of positions, re-orienting the patient's eyes and retina in a rapid and controllable fashion. CellScope Retina was capable of capturing and stitching montage wide-field, 100-degree images of a broad variety of retinal pathologies in the nonoptimal imaging conditions of an ophthalmic consultation service and emergency department setting.
Phantom development for retinal imaging

January 2018
Quantifying Retinal Area in Ultra-Widefield Imaging Using a 3-Dimensional Printed Eye Model
https://doi.org/10.1016/j.oret.2017.03.011

The grid in the original image (left) is traced using Photoshop CS2 (Adobe, San Jose, CA; middle). In this example, the line thickness is set at 5 pixels for ease of the reader; however, in determining the area, this was set at 1 pixel for increased accuracy. The traced image was used to determine the area of each ring in pixels using ImageJ (bottom).

Design of the model eye with an axial length of 24 mm, with section A-A representing the coronal plane and section B-B the sagittal plane (top left). The radius of the model is 13 mm (top right). The walls of the model eye have a thickness of 2 mm. Each model is made up of multiple rings centered at the posterior pole, with each ring separated by 9° as in the image. The top right image represents the sagittal plane and the bottom left image represents the coronal plane. The bottom right image represents the model eye viewed externally.
Wide-field fundus image quality in clinical practice

2016
Posterior Segment Distortion in Ultra-Widefield Imaging Compared to Conventional Modalities
https://doi.org/10.3928/23258160-20160707-06

The proposed averaging of images taken 90° apart can improve the quality of the images obtained using the Optos system. An acknowledgment and correction of this posterior segment distortion will increase the accuracy that the revolutionary Optos system has to offer.

2017
Can ultra-wide field retinal imaging replace colour digital stereoscopy for glaucoma detection?
https://doi.org/10.1080/09286586.2017.1351998
National Institute for Health Research Moorfields Biomedical Research Centre, Moorfields Eye Hospital and University College London Institute of Ophthalmology, London

In conclusion, this study demonstrated almost perfect agreement between colour digital stereoscopy and the Optomap, an ultra-wide field imaging technique, when assessed by a glaucoma specialist. It also showed the UWF technique was reproducible in VCDR estimates. Our data suggest that UWF imaging may be suitable for diagnosing glaucoma in situations in which slit-lamp biomicroscopy or digital colour stereoscopy are not available, and further research about the comparative diagnostic performance of UWF and other imaging technologies may be warranted.

2018
Peripheral Retinal Imaging Biomarkers for Alzheimer's Disease: A Pilot Study
https://doi.org/10.1159/000487053

Whether ultra-widefield (UWF, Optos P200C AF) retinal imaging can identify biomarkers for Alzheimer's disease (AD) and its progression. … after clinical progression over 2 years, suggesting that monitoring pathological changes in the peripheral retina might become a valuable tool in AD monitoring.
Clinical Reviews

April 2016
ULTRA-WIDEFIELD FUNDUS IMAGING: A Review of Clinical Applications and Future Trends
http://doi.org/10.1097/IAE.0000000000000937

Over the last 40 years, several innovative approaches to wide-angle fundus imaging have been introduced. These include the Pomerantzeff camera, the Panoret-1000, the RetCam, and various contact and noncontact lens-based systems. These instruments can provide 100° to 160° panoramic photographs using either traditional fundus photography or confocal SLO (cSLO).

A major disadvantage of several of these approaches, including the Pomerantzeff camera, the Panoret-1000, the RetCam, and the Staurenghi lens, is the utilization of a contact lens, which requires a skilled photographer to hold the camera and lens in place during image acquisition.

A major source of frustration for retinal physicians has been the difficulty associated with creating fundus drawings in these electronic systems (EHR). A potential solution would be the seamless integration of an UWF color or angiographic image into the examination note that could be supplemented with annotations by the physician to note the important findings.

Schematic illustration of ultra-widefield imaging (Optos) of the retina using an ellipsoidal mirror. A laser light source is reflected off the galvanometer mirrors onto an ellipsoidal mirror. The second focal point of the mirror resides within the eye, which facilitates image acquisition anterior to the equator.

Optos ultra-widefield fluorescein angiography of proliferative diabetic retinopathy. Right (A) and left (B) eyes of a patient with scattered microaneurysms, peripheral capillary nonperfusion, and focal leakage consistent with neovascularization elsewhere. The peripheral neovascularization and nonperfusion are not detectable using traditional seven-field fundus imaging (C and D).
Deep learning with wide-field retinal imaging

2017
Accuracy of deep learning, a machine-learning technology, using ultra-wide-field fundus ophthalmoscopy for detecting rhegmatogenous retinal detachment
https://doi.org/10.1038/s41598-017-09891-x

This study had several limitations. When clarity of the eye is reduced because of severe cataract or dense vitreous haemorrhage, capturing images with Optos becomes challenging; thus, such cases were not included in this study.

July 2018
Deep-learning Classifier With an Ultrawide-field Scanning Laser Ophthalmoscope Detects Glaucoma Visual Field Severity
https://doi.org/10.1097/IJG.0000000000000988

To evaluate the accuracy of detecting glaucoma visual field defect severity using a deep-learning (DL) classifier with an ultrawide-field scanning laser ophthalmoscope.

May 2018
Accuracy of ultra-wide-field fundus
ophthalmoscopy-assisted deep learning, a
machine-learning technology, for detecting
age-related macular degeneration
https://doi.org/10.1007/s10792-018-0940-0

A combination of DCNN with Optos images is not


better than a medical examination; however, it can
identify exudative AMD with a high level of
accuracy. Our system is considered useful for
screening and telemedicine.
“Animal”
Fundus
Imaging
Optical
Designs
Modeling the optics of the rat eye with ZEMAX

April 2011
Novel non-contact retina camera for the rat and its application to dynamic retinal vessel analysis
https://doi.org/10.1364/BOE.2.003094

A novel optical model of the rat eye was developed for use with standard ZEMAX optical design software, facilitating both sequential and non-sequential modes. A retinal camera for the rat was constructed using standard optical and mechanical components. The addition of a customized illumination unit with Xenon fiber-coupled light source and existing standard software enabled dynamic vessel analysis.
Optimizing fundus image quality for a rat model

October 2015
Investigating the influence of chromatic aberration and optical illumination bandwidth on fundus imaging in rats
https://doi.org/10.1117/1.JBO.20.10.106010

λpeak = 580 nm, hbw = 19 nm

Noninvasive, high-resolution retinal imaging of rodent models is highly desired for longitudinally investigating the pathogenesis and therapeutic strategies. However, due to severe aberrations, the retinal image quality in rodents can be much worse than that in humans.

We numerically and experimentally investigated the influence of chromatic aberration and optical illumination bandwidth on retinal imaging. We confirmed that the rat retinal image quality decreased with increasing illumination bandwidth. We achieved a retinal image resolution of 10 μm using a 19 nm illumination bandwidth centered at 580 nm in a home-built fundus camera.

Furthermore, we observed higher chromatic aberration in albino rat eyes than in pigmented rat eyes. This study provides a design guide for high-resolution fundus cameras for rodents. Our method is also beneficial to dispersion compensation in multiwavelength retinal imaging applications.
Contact lenses for water-immersion imaging

July 2018
Effect of a contact lens on mouse retinal in vivo imaging: Effective focal length changes and monochromatic aberrations
https://doi.org/10.1016/j.exer.2018.03.027

For in vivo mouse retinal imaging, especially with Adaptive Optics instruments, application of a contact lens (with GelTeal) is desirable, as it allows maintenance of cornea hydration and helps to prevent cataract formation during lengthy imaging sessions.

However, since the refractive elements of the eye (cornea and lens) serve as the objective for most in vivo retinal imaging systems, the use of a contact lens, even with 0 Dpt. refractive power, can alter the system's optical properties.

In this investigation we examined the effective focal length change and the aberrations that arise from use of a contact lens.

Based on the ocular wavefront data we evaluated the effect of the contact lens on the imaging system performance as a function of the pupil size. These results provide information for determining the optimum pupil size for retinal imaging without adaptive optics, and raise critical issues for the design of mouse optical imaging systems that incorporate contact lenses.

The effect of a contact lens and gel on ocular aberration is complex. In our system, the use of a contact lens introduced vertical coma and spherical aberrations above those of the native eye.
Tunable goggle lens for rodent models

July 2017
Optical modelling of a supplementary tunable air-spaced goggle lens for rodent eye imaging
https://doi.org/10.1371/journal.pone.0181111

In this study, we present the concept of a tunable goggle lens, designed to compensate individual ocular aberration for different rodent eye powers. Ray tracing evidences that lens-fitted goggles permit not only adjusting individual eye powers, but also surpassing conventional adaptive correction techniques over a large viewing angle, provided a minimum of two spaced liquids is used. We believe that the overlooked advantage of the 3D lens function is a seminal finding for further technological advancements in widefield retinal imaging.

Example of a multi-element lens-fitted goggle rigidly fastened to an optical system: The goggle lens having a corneal matching index (see Jiang et al. 2016 for details) is made of a plurality of liquid-filled cavities, having distinct refractive indices, separated by elastic surface membranes that enable a static correction of the eye by restructuration of the rodent cornea.
Improving contact lens modelling itself for all imaging studies

2018
Nonpupil adaptive optics for visual simulation of a customized contact lens
https://doi.org/10.1364/AO.57.000E57

We present a method for determining the deformable mirror profile to simulate the optical effect of a customized contact lens in the central visual field. Using nonpupil-conjugated adaptive optics allows a wider field simulation compared to traditional pupil-conjugated adaptive optics. For a given contact lens, the mirror shape can be derived analytically using Fermat's principle of the stationary optical path or numerically using optimization in ray-tracing programs such as Zemax.

Diagram of the aspheric contact lens and schematic eye model.
“Video-based”
Fundus
Imaging
Optical
Designs
Can't really use flashes of visible light

2018
Pupillary Abnormalities with Varying Severity of Diabetic Retinopathy
https://doi.org/10.1117/12.2036970
Mukesh Jain, Sandeep Devan, Durgasri Jaisankar, Gayathri Swaminathan, Shahina Pardhan & Rajiv Raman

Pupil measurements were performed with a 1/3 inch infrared camera and flash light (10 Ws xenon flash lamp, Orion Fundus Camera, Nidek Technologies, Italy).

The pupil constricts from the flash, and dynamic pupillometry can be done with the flash of commercial fundus cameras (see right →).

13 – Xenon flash tube on fundus camera design
http://doi.org/10.1167/iovs.12-10449
Kenneth Tran; Thomas A. Mendel; Kristina L. Holbrook; Paul A. Yates

Spectral Power Distribution of Canon Speedlite 540EZ consumer Xenon photographic flash, SPD at full intensity (1/1)

Not a problem unless you always image through the central 2 mm pupil, for example (Maxwellian optics).

Xenon flashes are typically powered by capacitor banks. The more capacitors involved, the higher the time constant, and thus the longer the flash duration.
http://www.fredmiranda.com/forum/topic/1485638
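The capacitor-bank remark above is just RC arithmetic. A back-of-the-envelope sketch (all component values are illustrative assumptions, and the single-RC exponential discharge is only a rough model of a real flash circuit):

```python
# Rough model of a xenon flash discharge: the capacitor bank dumps its
# stored energy through the tube, and the pulse decays roughly
# exponentially with time constant tau = R * C. Component values below
# are illustrative assumptions, not measurements of any specific camera.

def flash_parameters(capacitance_f, voltage_v, tube_resistance_ohm):
    """Return stored energy (J) and approximate pulse time constant (s)."""
    energy_j = 0.5 * capacitance_f * voltage_v ** 2   # E = 1/2 C V^2
    tau_s = tube_resistance_ohm * capacitance_f       # tau = R C
    return energy_j, tau_s

# Doubling the capacitor bank doubles both the stored energy and the
# time constant, hence the longer flash duration noted above.
e1, t1 = flash_parameters(1e-3, 330.0, 1.0)   # 1000 uF bank at 330 V
e2, t2 = flash_parameters(2e-3, 330.0, 1.0)   # two such banks in parallel
print(e1, t1)   # 54.45 J, tau = 1 ms
print(e2, t2)   # 108.9 J, tau = 2 ms
```

This is why a brighter flash is also a longer flash for a fixed tube: energy and duration both scale with C.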
Either continuous IR lighting or optical design to cope with the small light-adapted pupil

2004
Observation of the ocular fundus by an infrared-sensitive video camera after vitreoretinal surgery assisted by indocyanine green
https://www.ncbi.nlm.nih.gov/pubmed/12707597

2008
US20100245765A1 Video infrared ophthalmoscope
https://patents.google.com/patent/US20100245765/en
David S. DYER, James Higgins; Dyer Holdings LLC

Downside: Near-infrared video does not necessarily capture all the features of the retinal fundus.

March 2016
Fundus Photography in the 21st Century—A Review of Recent Technological Advances and Their Implications for Worldwide Healthcare
https://doi.org/10.1089/tmj.2015.0068
Nishtha Panwar, Philemon Huang, Jiaying Lee, Pearse A. Keane, Tjin Swee Chuan, Ashutosh Richhariya, Stephen Teoh, Tock Han Lim, and Rupesh Agrawal

Oculus Imagecam 2 Digital Slit Lamp Camera: Different segments of the eye, such as anterior segment, fundus, sclera, etc., can be conveniently imaged by setting suitable exposure time, light magnification, and white balance. Additional video sequences can be recorded by the high-resolution digital video camera in the beam path of the slit lamp.

Volk Pictor enables nonmydriatic fundus examination with an improved 40° FoV. It has modifications allowing still images and videos of the optic disc, macula, and retinal vasculature.

The Horus Scope from JedMed (St. Louis, MO) is a hand-held ophthalmoscopic adaptor for viewing the retina and capturing video and still images that can be easily transferred to a personal computer.

Riester Ri-Screen Multifunctional Digital Camera System: This slit lamp-based system, along with an attachable ophthalmoscopic lens, enables ophthalmic imaging and nonmydriatic eye fundus examination. The Riester (Jungingen, Germany) Ri-Screen provides digital images and video to support screening and documentation of ocular lesions and anomalies.

Smartphone-based approach from Harvard Medical School, Boston (Haddock et al. 2013): They used the iPhone camera's built-in flash for acquiring images and an external 20D lens for focusing, with the Filmic Pro app (£5.99) for control of light intensity, exposure, and focus. A Koeppe lens was used for imaging patients under anesthesia. Still images were then retrieved from the recorded video (as with D-Eye). When imaging the fundus of rabbits, 28D or 30D lenses have been shown to give better results.

F-10 confocal digital ophthalmoscope from NIDEK
http://usa.nidek.com/products/scanning-laser-ophthalmoscope/
Stripe-field method

SPIE BiOS, 2014
Non-mydriatic, wide field, fundus video camera
https://doi.org/10.1117/12.2036970
Bernhard Hoeher; Peter Voigtmann; Georg Michelson; Bernhard Schmauss

We describe a method we call "stripe field imaging" that is capable of capturing wide field color fundus videos and images of the human eye at pupil sizes of 2 mm.

We designed the demonstrator as a low-cost device consisting of mass market components to show that there is no major additional technical outlay to realize the improvements we propose. The technical core idea of our method is breaking the rotational symmetry in the optical design that is given in many conventional fundus cameras.

By this measure we could extend the possible field of view (FOV) at a pupil size of 2 mm from a circular field 20° in diameter to a square field 68° by 18° in size. We acquired a fundus video while the subject was slightly touching and releasing the lid. The resulting video showed changes at vessels in the region of the papilla and a change of the paleness of the papilla.

Stripe-field method: 1st and 2nd Purkinje reflections focused to the unused lower black stripe; 4th Purkinje reflection focused to the upper unused stripe; gaining unlimited width of the field of view in the center.
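The claimed FOV gain is easy to sanity-check in a flat-angle approximation (treating visual angles as planar coordinates, a simplification that is only indicative at these field sizes):

```python
import math

# Compare the areal field of view of the two configurations quoted above:
# a circular field 20 deg in diameter vs. a 68 deg x 18 deg stripe.
circular_fov = math.pi * (20.0 / 2) ** 2   # ~314.2 deg^2
stripe_fov = 68.0 * 18.0                   # 1224.0 deg^2
print(stripe_fov / circular_fov)           # roughly 3.9x more retinal area
```

So breaking the rotational symmetry buys roughly a factor of four in covered area at the same 2 mm pupil, not just a wider strip.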
Binocular Ophthalmoscope

2017 SPIE
Binocular video ophthalmoscope for simultaneous recording of sequences of the human retina to compare dynamic parameters
https://doi.org/10.1117/12.2282898
Ralf P. Tornow, Aleksandra Milczarek, Jan Odstrcilik, and Radim Kolar

A parallel video ophthalmoscope was developed to acquire short video sequences (25 fps, 250 frames) of both eyes simultaneously with exact synchronization.

Video sequences were registered off-line to compensate for eye movements. From registered video sequences, dynamic parameters like cardiac-cycle-induced reflection changes and eye movements can be calculated and compared between eyes.
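The off-line registration step can be sketched with plain phase correlation; this is a minimal numpy illustration of the idea, not the authors' actual pipeline:

```python
import numpy as np

def phase_correlation_shift(ref, moving):
    """Estimate the integer (dy, dx) translation between two frames via
    phase correlation, one common way to register fundus video frames
    off-line before computing cardiac-cycle reflection changes."""
    F1 = np.fft.fft2(ref)
    F2 = np.fft.fft2(moving)
    cross = F1 * np.conj(F2)
    cross /= np.abs(cross) + 1e-12          # keep only the phase
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # unwrap shifts larger than half the frame to negative values
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return int(dy), int(dx)

# synthetic check: shift a random "frame" by a known amount
rng = np.random.default_rng(0)
frame = rng.random((128, 128))
shifted = np.roll(frame, (5, -3), axis=(0, 1))
print(phase_correlation_shift(shifted, frame))   # (5, -3)
```

Subpixel refinements and rotation handling would come on top of this in a real eye-movement compensation pipeline.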
Concept design of what a portable binocular fundus camera could look like

2018
Korean startup ROOTEE HEALTH

It does not hurt at all to think about the UX for the end-user (clinician, or a non-trained operator who could be the patient or an optician, for example).

Naturally this does not exclude the need for good optical design and good computational image enhancement. Combine these all into one solution, and you will have a successful business that brings actual value to the patients instead of the often over-hyped "startup value".

"ELI (Eye-Linked-Information), a wearable fundus camera, possesses an auto-shot feature which removes the need for manually adjusting the camera to focus on the retina. This removes the need for patients to undergo taking several photos with flashes. With the use of ELI, patients previously undiagnosed or lost in between their 'first' diagnosis of diabetes and later arising diabetic complications related to the eye will be possible to prevent. Providing the Internal Medicine department the tool to diagnose diabetic retinopathy is crucial as timing for treatment must be in early stages.

There's a gap between the first prototype and ELI. We want to improve cost-effectiveness and accuracy by using adaptive optics & deep learning technology."
“Custom”
Fundus
Imaging
Optical
Designs
Fundus imaging for the Stiles-Crawford effect

2017 SPIE
Development of a fundus camera for analysis of photoreceptor directionality in the healthy retina
http://hdl.handle.net/10362/15618
Author: Anjos, Pedro Filipe dos Santos; Advisors: Vohnsen, Brian; Vieira, Pedro

The Stiles-Crawford effect (SCE) is the well-known phenomenon in which the brightness of light perceived by the human eye depends upon its entrance point in the pupil.

Retinal imaging, a widely spread clinical practice, may be used to evaluate the SCE and thus serve as a diagnostic tool. Nonetheless, its use for such a purpose is still underdeveloped and far from the clinical reality. In this project a fundus camera was built and used to assess cone photoreceptor directionality by reflective imaging of the retina in healthy individuals.

Diagram of the final system:
1 – Illuminator; 2 – Optical Fibre; 3 – Millimetrical Stage; 4 – Red Filter; 5 – Iris Diaphragm; 6 – Maxwellian Lens; 7 – Beamsplitter; 8 – Imaging Lens; 9 – Zoom Lens; 10 – Sensor; 11 – Desktop Computer.
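Quantitatively, the SCE of the first kind is usually modelled as a Gaussian apodization of pupil entry, eta(r) = 10^(-rho * r^2), with rho around 0.05 mm^-2 at the fovea (a typical literature value, not one from this thesis). A one-liner makes the magnitude concrete:

```python
def sce_relative_efficiency(r_mm, rho=0.05):
    """Stiles-Crawford effect of the first kind: relative luminous
    efficiency of a ray entering the pupil r_mm away from the SCE peak,
    eta = 10**(-rho * r**2). rho ~ 0.05 mm^-2 is a typical foveal value."""
    return 10.0 ** (-rho * r_mm ** 2)

# A ray entering 3 mm off the SCE peak is perceived at roughly 35% of
# the brightness of a ray entering through the peak:
print(round(sce_relative_efficiency(3.0), 2))   # 0.35
```

This directionality is exactly what the reflective-imaging approach above tries to measure from the pupil-position dependence of retinal reflectance.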
“Novel”
Fundus
Imaging
Optical
Designs
Inspiration for compact ophthalmic imaging designs: Trans-epidermal illumination

Timothé Laforest et al. (2017), Quantitative phase imaging of retinal cells
https://arxiv.org/abs/1701.08854

a. Trans-epidermal illumination by means of a flexible PCB containing LEDs placed in contact with the skin of the eyelid. Light is then transmitted inside the eyeball. After scattering off the eye fundus, the light passing through the retina's cell layers is collected by the eye lens. b. Flexible PCB holding 4 red LEDs. c. Recording and reconstruction procedure for in-vivo measurement. d. Experimental setup. The light scattered from the retina is collected by lens L1. The 4f system composed of the lenses L1 and L2 is adjusted for defocus thanks to a Badal system. The lens L2 forms an image of the pupil plane at its focal distance, while the lens L3 forms an image of the retina on the EMCCD camera. Dic: dichroic mirror. Synchronization between the LEDs and the camera is performed thanks to a programmable board.

By collecting the scattered light through the pupil, the partially coherent illumination produces dark field images, which are combined to reconstruct a quantitative phase image with twice the numerical aperture given by the eye's pupil. We then report, to our knowledge, the very first human in vivo phase images of inner retinal cells with high contrast.

Illumination of the retinal layers provided by transscleral illumination. The light is first transmitted through sclera, RPE and retina. After travelling through the aqueous humor it impinges on the RPE. Here backscattering off the RPE generates a new illumination beam. This secondary illumination provides a transmission light propagating through the translucent layers of the retina which is then collected by the pupil. Azimuthal angle θ and polar angle α.
Trans-palpebral illumination

Paper 1
Trans-palpebral illumination: an approach for wide-angle fundus photography without the need for pupil dilation
Devrim Toslak, Damber Thapa, Yanjun Chen, Muhammet Kazim Erol, R. V. Paul Chan, and Xincheng Yao
https://doi.org/10.1364/OL.41.002688
Optics Letters Vol. 41, Issue 12, pp. 2688-2691 (2016)

"Retinal field-of-view (interior angle of 152°, and exterior angle 105°)"

Digital super-resolution algorithms are also being considered for further resolution improvements [Thapa et al. 2014]. In addition to the smartphone-based prototype imaging device, we are currently constructing a benchtop prototype for testing the feasibility of wide-angle fluorescein angiography employing the trans-palpebral illumination.
Trans-pupillary illumination

Paper 2
Near-infrared light-guided miniaturized indirect ophthalmoscopy for nonmydriatic wide-field fundus photography
Devrim Toslak, Damber Thapa, Yanjun Chen, Muhammet Kazim Erol, R. V. Paul Chan, and Xincheng Yao
https://doi.org/10.1364/OL.41.002688
Optics Letters Vol. 41, Issue 12, pp. 2688-2691 (2016)
Trans-pars-planar illumination

Contact-free trans-pars-planar illumination enables snapshot fundus camera for nonmydriatic wide field photography
Benquan Wang, Devrim Toslak, Minhaj Nur Alam, R. V. Paul Chan & Xincheng Yao
https://doi.org/10.1038/s41598-018-27112-x
Scientific Reports volume 8, Article number: 8768 (2018)

The Panoret-1000™ employed trans-scleral illumination to image the retina from the optic disc to the ora serrata in a single-shot image (Shields et al. 2003). However, clinical deployment of trans-scleral illumination was not successful, and the Panoret-1000™ is no longer commercially available. It failed due to several limiting factors. First, the contact-mode imaging employed was not favorable for patients: direct contact of the illuminating and imaging parts with the eyeball might produce inflammation, contamination, and abrasion of the sclera and cornea. Second, it was difficult to operate the system to obtain good retinal images. In the Panoret-1000™, the digital camera and light illuminator were separate from each other: to capture a retinal image, one hand was used to operate the camera while the other adjusted the light illuminator. The simultaneous need for both hands made the device very difficult to use.

Instead of using a light illuminator contacting the eyelid (trans-palpebral illumination) or sclera (trans-scleral illumination), trans-pars-planar illumination is totally contact-free, projecting illuminating light through the pars plana.

Representative fundus images with illumination at different locations. (a) Illustration of different
illumination locations. (b) Fundus images acquired at different illumination locations. b1-b3 were acquired at
corresponding locations P1-P3 in panel a. (c) Average intensity of fundus images collected with constant power
illumination delivered through different locations. The curve is an average of 5 trials from one subject. Gray shadow
shows standard deviation. P1-P3 corresponds to images b1-b3. (d) Red, green and blue channels of the fundus
image b2. (e) Normalized fundus image b2, with digital compensation of red and green channel intensities. Macula,
optic disc, nerve fiber bundles and blood vessels could be clearly identified.
Retinal Imaging Illumination Summary
https://doi.org/10.1038/s41598-018-27112-x

Schematic illustration of different illumination schemes for retinal imaging: trans-pupillary illumination, trans-scleral illumination, trans-palpebral illumination, and trans-pars-planar illumination.

A lens was used to image the aperture onto the sclera to form an arc-shaped illumination pattern. The illumination aperture was carefully designed to closely match the shape of the pars plana. The end of the illuminating arm close to the eye could be manually moved in a horizontal direction by a translation stage to precisely deliver illumination light to the pars plana. Light passing through the pars plana was diffused and illuminated the intraocular area homogeneously.
Transcranial Illumination

August 2018
Non-mydriatic chorioretinal imaging in a transmission geometry and application to retinal oximetry
doi: 10.1364/BOE.9.003867
Timothy D. Weber and Jerome Mertz, Boston University
Project: Transcranial retinal imaging

Artifacts caused by retinal surface reflex are often encountered, which complicate quantitative interpretation of the reflection images. We present an alternative illumination method, which avoids these artifacts. The method uses deeply penetrating near-infrared (NIR) light delivered transcranially from the side of the head, and exploits multiple scattering to redirect a portion of the light towards the posterior eye.

A: Schematic of fundus transillumination and imaging. LEDs at several central wavelengths (λN) are imaged via coupling optics (CO), comprised of lenses f1 and f2, onto the proximal end of a flexible fiber bundle (FB). A commercial fundus camera (FC) images the transilluminated fundus onto a camera (CCD). B: Example raw image recorded on the CCD. C: Normalized measured spectra of available high-power deep red and NIR LEDs.

This unique transmission geometry simplifies absorption measurements and enables flash-free, non-mydriatic imaging as deep as the choroid. Images taken with this new transillumination approach are applied to retinal oximetry.
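For context, two-wavelength retinal oximetry boils down to optical-density ratios. A minimal Beer-Lambert sketch (the calibration constants and intensities here are illustrative assumptions, not values from the paper):

```python
import math

def optical_density(i_vessel, i_background):
    # Beer-Lambert: OD = log10(I_background / I_vessel) at one wavelength
    return math.log10(i_background / i_vessel)

def oxygen_saturation(od_sensitive, od_isosbestic, a=1.0, b=1.0):
    """Linear OD-ratio model SO2 = a - b * (OD_sensitive / OD_isosbestic).
    a and b are device-specific calibration constants; the defaults here
    are illustrative placeholders."""
    return a - b * (od_sensitive / od_isosbestic)

# Example: a vessel that appears brighter at the oxygen-sensitive
# wavelength (less absorption) yields a lower OD ratio and hence a
# higher estimated saturation.
od_sens = optical_density(80.0, 100.0)   # oxygen-sensitive wavelength
od_iso = optical_density(50.0, 100.0)    # isosbestic wavelength
print(round(oxygen_saturation(od_sens, od_iso), 2))   # 0.68
```

The transmission geometry above helps precisely because the vessel and background intensities entering these ratios are no longer confounded by surface reflexes.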
Aplanat Fundus Imaging

June 2018
Fundus Imaging using Aplanat
https://doi.org/10.1080/24699322.2017.1379143
Vishwanath Manik Rathod, M.Sc. Thesis, Indian Institute of Technology Hyderabad

In this thesis, we suggest an alternative optics for fundus imaging. The design constraints of the aplanat help remove all major aberrations observed with lenses without adding any extra corrective measures like those of lens optical systems. And since the proposed system does not have a complex set of lenses, system complexity and cost are reduced significantly.

The major advantage of the system is that it offers a wide numerical aperture and a large field of view while the system size remains that of a handheld device. The large NA and high radiation efficiency abolish the need for pupil dilation, making the process painless for the patient.

In order to image the retina completely, three phases of imaging need to be done. A narrow-field aplanat is used to image the part near the optical axis of the eye, a wide-throat aplanat is used to image the peripheral region, and the hole at the center that remains undetected through the aplanat can be imaged using a normal lens system. Exploiting the overlapping parts in the images from all the systems, a stitching algorithm can be used later to form a complete image. This system leads to a total FOV of 200°.

Steps in Solid Edge
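The three-phase capture ends with stitching; the overlap blending at its core can be sketched as linear feathering (an illustrative numpy sketch assuming the images are already registered to a common grid; this is not the thesis' actual algorithm):

```python
import numpy as np

def feather_blend(left, right, overlap):
    """Blend two 2D images that share `overlap` columns, ramping
    linearly from the left image to the right one across the overlap."""
    h, w_l = left.shape
    w_r = right.shape[1]
    out = np.zeros((h, w_l + w_r - overlap), dtype=float)
    out[:, :w_l - overlap] = left[:, :w_l - overlap]
    out[:, w_l:] = right[:, overlap:]
    # linear ramp from weight 1 (left) to 0 across the overlap region
    alpha = np.linspace(1.0, 0.0, overlap)
    out[:, w_l - overlap:w_l] = (alpha * left[:, w_l - overlap:]
                                 + (1 - alpha) * right[:, :overlap])
    return out

# toy example: a "narrow-field" patch meeting a "wide-throat" patch
a = np.full((4, 6), 10.0)
b = np.full((4, 6), 20.0)
blended = feather_blend(a, b, overlap=2)
print(blended.shape)   # (4, 10)
print(blended[0])      # 10s, then a ramp through the overlap, then 20s
```

A real mosaic would first register the fields (feature matching or known geometry) and feather in 2D, but the overlap-weighting idea is the same.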

The process flows as follows: coordinates computed in MATLAB are exported to Solid Edge, where the aplanat reflector is made as a CAD object and then imported into Zemax. Zemax supports four CAD formats: STL, IGES, STEP and SAT. Of these, only STL uses facets to represent the object; the other three model the object as a smooth, continuous surface shape.
Freeform Optics Fundus Imaging?

May 2018
Starting geometry creation and design method for freeform optics
https://doi.org/10.1038/s41467-018-04186-9
Aaron Bauer, Eric M. Schiesser & Jannick P. Rolland

May 2018
Design of Freeform Illumination Optics
https://doi.org/10.1002/lpor.201700310
Rengmao Wu, Zexin Feng, Zhenrong Zheng, Rongguang Liang, Pablo Benítez, Juan C. Miñano, Fabian Duerr

The review focuses on the design of freeform illumination optics, which is a key factor in advancing the development of illumination engineering.

May 2018
Over-designed and under-performing: design and analysis of a freeform prism via careful use of orthogonal surface descriptions
https://doi.org/10.1117/12.2315641
Nicholas Takaki; Wanyue Song; Anthony J. Yee; Julie Bentley; Duncan Moore; Jannick P. Rolland - Institute of Optics, Univ. of Rochester

High-performance integral-imaging-based light field augmented reality display using freeform optics
https://doi.org/10.1364/OE.26.017578
Hekun Huang and Hong Hua

Medical Optical Systems
DEMCON Focal, Institutenweg 25A, 7521 PH Enschede
http://focal.nl/en/#technology
Freeform Optics for Corneal Imaging

2017
Freeform optical design for a nonscanning corneal imaging system with a convexly curved image
https://doi.org/10.1364/AO.56.005630
Yunfeng Nie, Herbert Gross, Yi Zhong, and Fabian Duerr

"The lateral resolution on the cornea is about 10 μm with good modulation transfer function (MTF) and spot performance. To ease the assembly, a monolithic design is achieved with slightly lower resolution, leading to a potential mass production solution."
Additive Manufacturing Optics Fundus Imaging?
3D printing optics, in other words

June 2018
3D printed photonics and free-form optics
http://www.uef.fi/en/web/photonics/3d-printed-photonics-and-free-form-optics
http://optics.org/news/4/6/8
Jyrki Saarinen, Jari Turunen (design), Markku Kuittinen, Anni Eronen, Yu Jiang, Petri Karvinen, Ville Nissinen, Henri Partanen, Pertti Pääkkönen, Leila Ahmadi, Rizwan Ali, Bisrat Assefa, Olli Ovaskainen, Dipanjan Das, Markku Pekkarinen

Dutch start-up LUXeXceL has invented the Printoptical® technology for 3D printing optical elements. Their technology is based on an inkjet printing process. In collaboration with Luxexcel, University of Eastern Finland will further develop the Printoptical® technology.

June 2016
Additive manufacturing of optical components
https://doi.org/10.1515/aot-2016-0021
Andreas Heinrich / Manuel Rank / Philippe Maillard / Anne Suckow / Yannick Bauckhage / Patrick Rößler / Johannes Lang / Fatin Shariff / Sven Pekrul

The additive manufacturing technology offers a high potential in the field of optics as well. Owing to new design possibilities, completely new solutions are possible. This article briefly reviews and compares the most important additive manufacturing methods for polymer optics. Additionally, it points out the characteristics of additively manufactured polymer optics. Thereby, surface quality is of crucial importance. In order to improve it, appropriate post-processing steps are necessary (e.g. robot polishing or coating), which will be discussed. An essential part of this paper deals with various additively manufactured optical components and their use, especially in optical systems for shape metrology (e.g. borehole sensor, tilt sensor, freeform surface sensor, fisheye lens). The examples should demonstrate the potentials and limitations of optical components produced by additive manufacturing.

Feb 2018
Additive manufacturing of reflective and transmissive optics: potential and new solutions for optical systems
https://doi.org/10.1117/12.2293130
A. Heinrich; R. Börret; M. Merkel; H. Riegel

Additive manufacturing enables the realization of complex shaped parts. This also provides a high potential for optical components. Thus elements with virtually any geometry can be realized, which is often difficult with conventional fabrication methods. Depending on the material and thus the manufacturing method used, either transparent optics or reflective optics can be developed with the aid of additive manufacturing. Our aim is to integrate the additively manufactured optics into optical systems. Therefore we present different examples in order to point out new possibilities and new solutions enabled by 3D printing of the parts. In this context, the development of 3D printed reflective and transmissive adaptive optics will be discussed as well.
Depth-
resolving
Fundus
Imaging
Optical
Designs
Fundus Stereo Imaging

2008
Quantitative depth analysis of optic nerve head using stereo retinal fundus image pair
http://doi.org/10.1117/1.3041711
Toshiaki Nakagawa, Takayoshi Suzuki, Yoshinori Hayashi, Tetsuya Yamamoto et al.

March 2014
3D papillary image capturing by the stereo fundus camera system for clinical diagnosis on retina and optic nerve
https://doi.org/10.1117/12.2038435
Danilo A. Motta; André Serillo; Luciana de Matos; Fatima M. M. Yasuoka; Vanderlei Salvador Bagnato; Luis A. V. Carvalho

2012
Quantitative Evaluation of Papilledema from Stereoscopic Color Fundus Photographs
http://doi.org/10.1167/iovs.12-9803
Li Tang; Randy H. Kardon; Jui-Kai Wang; Mona K. Garvin; Kyungmoo Lee; Michael D. Abràmoff

Convergent visual system for depth calculation of stereo image pair.

Comparison of ONH shape estimates from stereo fundus photographs and OCT scans using topographic maps. (A) Reference (left) fundus image wrapping onto reconstructed topography as output from stereo photographs. Small squares with different colors are marked at four corners of the reference image, indicating the orientation of retinal surface rendering. (B) Fundus image wrapping onto reconstructed topography as output from the OCT scans from the same view angle.
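The depth calculation behind such stereo pairs reduces to the pinhole relation Z = f * B / d (the numbers below are illustrative, not taken from the cited papers):

```python
def disparity_to_depth(focal_px, baseline_mm, disparity_px):
    """Standard pinhole stereo relation Z = f * B / d: depth is focal
    length (in pixels) times camera baseline divided by disparity.
    This is the geometry behind the convergent stereo depth
    calculation; real fundus stereo adds the eye's own optics."""
    return focal_px * baseline_mm / disparity_px

# A feature with 8 px disparity, a 20 mm baseline and a 1200 px focal
# length maps to a depth of 3000 mm in this toy geometry:
print(disparity_to_depth(1200.0, 20.0, 8.0))   # 3000.0
```

The relation also shows why small baselines (as forced by the pupil) give shallow depth sensitivity: depth error grows as disparity shrinks.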
Fundus Stereo Imaging #2

2015
All-in-focus imaging technique used to improve 3D retinal fundus image reconstruction
https://doi.org/10.1145/2695664.2695845 | https://doi.org/10.1117/12.2038435
Danilo Motta, Luciana de Matos, Amanda Caniatto de Souza, Rafael Marcato, Afonso Paiva, Luis Alberto Vieira de Carvalho
R&D – Wavetek; Universidade de São Paulo, São Carlos, SP, Brazil
Snapshot stereo fundus systems

2017
Design of optical system for binocular fundus camera
https://doi.org/10.1080/24699322.2017.1379143
Jun Wu, Shiliang Lou, Zhitao Xiao, Lei Geng, Fang Zhang, Wen Wang & Mengjia Liu

A non-mydriasis optical system for a binocular fundus camera has been designed in this paper. It can capture two images of the same fundus retinal region from different angles at the same time, and can be used to achieve three-dimensional reconstruction of the fundus.

According to the requirements of the medical light source, a sodium lamp whose wavelength is 589 nm is selected as the light source, and its spectral range is 560-610 nm. The magnifying power of the imaging system is 1.07, and the cut-off frequency of the object is 96.3 lp/mm, that is, our system can distinguish structure units of 5.2 μm. In order to make operation and adjustment more convenient, the size of this optical system is set to be 480 mm x 100 mm x 200 mm.

Diagram of imaging system.
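The quoted 5.2 μm follows directly from the cut-off frequency, since one resolvable element is half a line pair:

```python
# Sanity check of the quoted numbers: a cut-off frequency of 96.3 lp/mm
# corresponds to a smallest resolvable element of 1 / (2 * f_c) mm,
# i.e. one line out of each line pair.
f_c = 96.3                        # cut-off frequency, line pairs per mm
element_um = 1000.0 / (2 * f_c)   # micrometres per resolvable line
print(round(element_um, 1))       # 5.2
```
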
3D fundus with aplanats

2018
3D Image Reconstruction of Retina using Aplanats
http://raiith.iith.ac.in/id/eprint/4109
Sankrandan Loke and Soumya Jana
Masters thesis, Indian Institute of Technology Hyderabad

The ray tracing program is written in Python, with assistance from the MATLAB toolbox Optometrika.

3D eye model: A very high resolution 3D retina is constructed using the Meshlab software. This is considered the digital version of the painted retina to be imaged. A hemi-spheroidal, high-density point cloud is created and normals are calculated at each point. Then the "screened Poisson Surface Reconstruction" filter is applied on it to create a mesh over the point cloud. The resultant mesh is cleaned and the face-normals and vertex-normals are normalized.
3D plot of eye, aplanat and its sensor
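The point-cloud step described above can be sketched in a few lines: sample a hemi-spheroidal cloud and attach outward normals, the input that a screened Poisson surface reconstruction (e.g. MeshLab's filter) expects. Radius and point count below are illustrative assumptions, not values from the thesis.

```python
import numpy as np

def hemisphere_cloud(radius_mm=11.0, n_points=10000, seed=0):
    """Posterior hemisphere (z <= 0) sampled uniformly, with normals."""
    rng = np.random.default_rng(seed)
    phi = rng.uniform(0.0, 2.0 * np.pi, n_points)    # azimuth
    cos_t = rng.uniform(-1.0, 0.0, n_points)         # keep the z <= 0 half
    sin_t = np.sqrt(1.0 - cos_t ** 2)
    points = radius_mm * np.column_stack(
        (sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t))
    # For a sphere centred at the origin, the outward normal is radial
    normals = points / np.linalg.norm(points, axis=1, keepdims=True)
    return points, normals

pts, nrm = hemisphere_cloud()
```

The (points, normals) pair is exactly what Poisson-type reconstruction filters consume before meshing and normal cleanup.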


Plenoptic
Fundus
Imaging
Optical
Designs
Plenoptic Fundus Imaging

2011
US20140347628A1
Multi-view fundus camera
Inventor: ASOCIACION INDUSTRIAL DE OPTICA, COLOR E IMAGEN - AIDO; Universitat de Valencia
Current Assignee: ASOCIACION INDUSTRIAL DE OPTICA COLOR E IMAGEN - AIDO; Universitat de Valencia

US8998411B2: "As described by Ren Ng (founder of Lytro), the "light field" is a concept that includes both the position and direction of light propagating in space (see for example U.S. Pat. No. 7,936,392)."

The idea has been around for some time now

2011
US8814362B2
Method for combining a plurality of eye images into a plenoptic multifocal image
Inventor: Steven Roger Verdooner
Current Assignee: Neurovision Imaging Inc

2011
US8998411B2
Light field camera for fundus photography
Inventor: Steven Roger Verdooner, Alexandre R. Tumlinson, Matthew J. Everett
Current Assignee: Carl Zeiss Meditec Inc

2013
US9060710B2
System and method for ocular tomography using plenoptic imaging
Inventor: Richard J. Copland
Current Assignee: AMO Wavefront Sciences LLC

2015
US9955861B2
Construction of an individual eye model using a plenoptic camera
Inventor: Liang Gao, Ivana Tosic
Current Assignee: Ricoh Co Ltd

DeHoog and Schwiegerling, "Fundus camera systems: a comparative analysis", Appl. Opt. 48, 221-228 (2009), https://doi.org/10.1364/AO.48.000221
Plenoptic Fundus Imaging

Short intro

Plenoptic imaging of the retina: can it resolve depth in scattering tissues?
Richard Marshall, Iain Styles, Ela Claridge, and Kai Bongs
https://doi.org/10.1364/BIOMED.2014.BM3A.60, PDF

"Plenoptic imaging has already proven its capabilities to determine depth and give 3D topographic information in free space models, however no study has shown how it would perform through scattering media such as the retina. In order to study this, simulations were performed using MCML, a multi-layered Monte Carlo modeling software [Wang et al. 1995]. The parameters characterising the properties of retinal layers and used in Monte Carlo (MC) simulation have been taken from Styles et al. (2006)."

Configurations of two different plenoptic cameras: (a) The traditional plenoptic camera. (b) The focused plenoptic camera.

Simulation of Light Field


Fundus Photography
Sha Tong and T.J. Melanson
Stanford course assignment
http://stanford.edu/class/ee367/Winter2017/Tong_Melanson_ee367_win17_report.pdf

Comparisons between normal camera (L), light field


camera (M) and reference image (R)

Light Field Images from


Different Viewing Points
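The core rendering step in such a light-field simulation is shift-and-add refocusing of the sub-aperture views. A dependency-free sketch on synthetic data (my naming; integer-pixel shifts via np.roll for brevity):

```python
import numpy as np

def refocus(views, slope):
    """views[u, v] is an HxW sub-aperture image; slope is the per-view
    disparity (pixels) of the plane to bring into focus."""
    U, V, H, W = views.shape
    uc, vc = (U - 1) / 2.0, (V - 1) / 2.0
    acc = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            dy = int(round(slope * (u - uc)))
            dx = int(round(slope * (v - vc)))
            acc += np.roll(views[u, v], (dy, dx), axis=(0, 1))
    return acc / (U * V)

# Synthetic light field: a point that shifts 1 px per view (disparity 1)
U = V = 3; H = W = 16
lf = np.zeros((U, V, H, W))
for u in range(U):
    for v in range(V):
        lf[u, v, 8 - (u - 1), 8 - (v - 1)] = 1.0

sharp = refocus(lf, 1.0)    # shifts undo the disparity -> point realigns
blurry = refocus(lf, 0.0)   # wrong plane -> energy spreads over 9 pixels
print(sharp.max() > blurry.max())   # True
```

Sweeping the slope parameter produces the focal stack from which depth maps like those in the comparison figure are derived.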
Plenoptic Fundus Imaging

Prototype System #1:
Moorfields Eye Hospital and University College of London

Retinal fundus imaging with a plenoptic sensor
Brice Thurin; Edward Bloch; Sotiris Nousias; Sebastien Ourselin; Pearse Keane; Christos Bergeles
https://doi.org/10.1117/12.2286448
Optical layout of the plenoptic fundus camera. A white LED illuminates the eye fundus through a polarizer and polarizing beamsplitter. The LED chip is conjugated with the eye pupil and an iris, while the condensing lens L1 is conjugated with the retinal plane. A primary image of the retina is formed by a Digital Wide-Field lens L4. This image is relayed to the plenoptic sensor (Raytrix R8) by L3 and L6 through the polarizing beamsplitter. The polarization helps reduce the corneal reflex.

Plenoptic ophthalmoscopy has been considered for eye examinations [Tumlinson and Everett 2011; Bedard et al. 2014; Lawson and Raskar 2014]. A crude implementation has been proposed by Adam et al. (2016): the system built is used as a substitute for a human observer, only a small portion of the field is used for fundus imaging, and it does not exploit the full capabilities of light-field imaging. More recently, a plenoptic sensor has been used to successfully characterize the topography of the healthy and diseased human iris in vivo [Chen et al. 2017].
Plenoptic Fundus Imaging

Glare-free retinal imaging using a portable light field fundus camera
Douglas W. Palmer, Thomas Coppin, Krishan Rana, Donald G. Dansereau, Marwan Suheimat, Michelle Maynard, David A. Atchison, Jonathan Roberts, Ross Crawford, and Anjali Jaiprakash
Biomedical Optics Express Vol. 9, Issue 7, pp. 3178-3192 (2018)
https://doi.org/10.1364/BOE.9.003178

Prototype System #2a:
Queensland University of Technology; Medical and Healthcare Robotics, Australian Centre for Robotic Vision, Brisbane; Institute of Health and Biomedical Innovation, Brisbane

Imaging path optical diagram of light field fundus camera. Top row (A,B,C) represents a correctly designed system where the entrance pupil diameter ØLF is smaller than the eye pupil ØP and the region of the sensor under the microlens shows minimal vignetting, where d is the number of pixels under a microlens horizontally and vertically. Bottom row (D,E,F) represents an incorrectly designed system where ØLF is larger than ØP. The resultant micro image vignetting is shown in (F). (A) and (D) show slices taken approximately through the iris of the eye. (B) and (E) show the arrangements of components and paraxial approximations of the ray paths for a point on-axis (blue) and off-axis (red). The design entrance and exit pupils are the images of the design aperture stop as seen through the objective and relay lenses respectively.

Plenoptoscope - General arrangement. Imaging path in gray, eye fixation path in


red, illumination path in yellow. The Lytro Illum light field camera has an internal fixed
f/2.0 aperture stop (not shown).
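Two of the design constraints in the caption above can be written down directly; a toy check with illustrative numbers (not the paper's actual values):

```python
# Rule-of-thumb checks implied by the figure: (1) the design entrance
# pupil diameter OLF must not exceed the eye pupil OP, else the
# micro-images vignette; (2) the number of pixels under each microlens,
# d, follows from the microlens pitch and the sensor pixel pitch.

def pupil_ok(entrance_pupil_mm, eye_pupil_mm):
    """True if the design entrance pupil fits inside the eye pupil."""
    return entrance_pupil_mm <= eye_pupil_mm

def pixels_under_microlens(microlens_pitch_um, pixel_pitch_um):
    """d, the micro-image width in pixels (horizontally and vertically)."""
    return microlens_pitch_um / pixel_pitch_um

print(pupil_ok(2.5, 4.0))                  # True  (correct design, top row)
print(pupil_ok(5.0, 4.0))                  # False (vignetting, bottom row)
print(pixels_under_microlens(20.0, 2.0))   # 10.0
```

The first check is why nonmydriatic light-field designs are constrained by the undilated pupil; the second sets the trade-off between angular samples per microlens and spatial resolution.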
Plenoptic Fundus Glare-free retinal imaging using a portable light
field fundus camera
Douglas W. Palmer, Thomas Coppin, Krishan Rana, Donald G. Dansereau, Marwan Suheimat, Michelle

Imaging
Maynard, David A. Atchison, Jonathan Roberts, Ross Crawford, and Anjali Jaiprakash
Biomedical Optics Express Vol. 9, Issue 7, pp. 3178-3192 (2018)
https://doi.org/10.1364/BOE.9.003178

Prototype System #2b:
Queensland University of Technology; Medical and Healthcare Robotics, Australian Centre for Robotic Vision, Brisbane; Institute of Health and Biomedical Innovation, Brisbane

Series of images captured using the Retinal Plenoptoscope. Images are shown in sets of two with the top image being a standard (not glare-free) render of the light field, and the bottom image being a gray-scale relative depth map. Each depth map has an associated scale that relates gray shade to depth. Note that the leftmost set is of a model eye, the second leftmost set is of a myopic eye (-5.75D), and the two rightmost sets are of emmetropic eyes.

An image of a human retina captured using the Retinal


Plenoptoscope with an associated epipolar image.
Annotations indicate areas of interest, where (A) and (C)
correspond to glare, and (B) corresponds to the Optic Disk.

Series of images created using various light field rendering techniques. Images
are shown in sets of three with the left image being the central view from the
light field, the middle image being a standard render with no glare masking, and
the right image being a glare-free render.
Plenoptic Iris Imaging

Human iris three-dimensional imaging at micron resolution by a micro-plenoptic camera
Hao Chen, Maria A. Woodward, David T. Burke, V. Swetha E. Jeganathan, Hakan Demirci, and Volker Sick
Biomedical Optics Express Vol. 8, Issue 10, pp. 4514-4522 (2017)
https://doi.org/10.1364/BOE.8.004514 | researchgate

Prototype System
Mechanical Engineering; Department of
Ophthalmology and Visual Sciences; Department of
Human Genetics | University of Michigan

A micro-plenoptic system (Raytrix R29) was designed to capture the


three-dimensional (3D) topography of the anterior iris surface by simple
single-shot imaging. Within a depth-of-field of 2.4 mm, depth resolution of
10 µm can be achieved with accuracy (systematic errors) and precision
(random errors) below 20%. 
Multimodal
Imaging
Optical
Designs
Combined Fundus and OCT Imaging

2007
Simultaneous Fundus Imaging and Optical Coherence Tomography of the Mouse Retina
http://doi.org/10.1167/iovs.06-0732
Omer P. Kocaoglu; Stephen R. Uhlhorn; Eleut Hernandez; Roger A. Juarez; Russell Will; Jean-Marie Parel; Fabrice Manns

To develop a retinal imaging system suitable for


routine examination or screening of mouse
models and to demonstrate the feasibility of
simultaneously acquiring fundus and optical
coherence tomography (OCT) images.
The mouse was held in a cylindrical holder made from 30-mL syringes. The position of the mouse was adjusted to align the optical axis of the mouse eye with the axis of the delivery system by using 6-μm screws.

Left: general optical design of the imaging system; right: the mouse fundus and OCT imaging system, including fundus imaging with a digital camera attached to the
photographic port of the slit lamp, the OCT beam delivery system, the six-axis mouse positioner, and the interferometer.
Fundus Camera Guided Photoacoustic Ophthalmoscopy

2013
Fundus Camera Guided Photoacoustic Ophthalmoscopy
https://doi.org/10.3109/02713683.2013.815219

A schematic of the imaging system designed and optimized for rat eyes is shown in right →
Ophthalmoscopy A 532-nm pulsed laser (Nd:YAG laser,
SPOT-10-100, Elforlight Ltd, UK; output
wavelength 1064 nm; pulse duration: 2 ns;
BBO crystal for second harmonic frequency
generation; CasTech, San Jose, CA) was used
as the illumination light source for PAOM.

The PAOM laser (green path) was scanned by


an x–y galvanometer (QS-7, Nutfield
Technology) and delivered to the posterior
segment of the eye after passing through a
relay lens L5 (f = 150 mm) and an objective lens
OBJ1 (f = 30 mm, VIS-NIR AR coated). The final
laser pulse energy on the cornea is 60 nJ,
which is considered eye-safe.

The induced PA waves were detected by a


custom-built unfocused needle
ultrasonic transducer (central frequency
35 MHz; bandwidth: 50%; active element size:
0.5 × 0.5 mm²). The ultrasonic transducer was
gently attached to the eye lid (close to the
canthus) coupled by a thin layer of medical-
grade ultrasound gel.
Infrared 2014
Infrared Retinoscopy
http://dx.doi.org/10.3390/photonics1040303
Retinoscopy
Retinoscopy could be a more effective and
versatile clinical tool in observing a wide range
of ocular conditions if modifications were made
to overcome the inherent difficulties. In this
paper, a laboratory infrared retinoscope
prototype was constructed to capture the
digital images of the pupil reflex of various
types of eye conditions.

The captured low-contrast reflex images due to


intraocular scattering were significantly
improved with a simple image processing
procedure for visualization. Detections of
ocular aberrations were demonstrated, and
computational models using patients’
wavefront data were built to simulate the
measurement for comparison.

The simulation results suggest that the retinal


stray light that is strongly linked to intraocular
scattering extend the detection range of
illuminating eccentricity in retinoscopy and
make it more likely to observe ocular
aberrations.
Light Levels
Maximum intensities
limited by what is
safe for human eye
ISO 15004-2.2
Standard for safe ophthalmic instruments
Used e.g. by Kölbl et al. (2015), Sheng Chiong Hong (2015); for discussion on limits, see Sliney et al. (2005)

Wang et al. (2018) https://doi.org/10.1038/s41598-018-27112-x
"According to the ISO 15007-2:2007 (Petteri: incorrect standard reference) standard, a retinal irradiance with maximum of 10 J/cm2 irradiance is allowed on the retina without photochemical hazard concern."

Hazard weighting functions according to the DIN EN ISO 15007-2:2014 standard, A(λ) and R(λ). A(λ) rates the photochemical and R(λ) the thermal hazard for all kinds of light sources. Kölbl et al. (2015)

Kim, Delori, Mukai (2012): Smartphone Photography Safety
https://www.aaojournal.org/article/S0161-6420(12)00410-1/pdf

The light safety limits for ophthalmic instruments set by the International Organization for Standardization (ISO 15004-2.2) recommend that spectral weighted irradiance (W/cm2/nm) on the retina be weighted separately for thermal and photochemical hazard functions or action spectra. These safety limits are at least 1 order of magnitude below actual retinal threshold damage [Delori et al. 2007; Sliney et al. 2002]. The radiant power of the smartphone was 8 mW.

For thermal hazard, the weighted retinal irradiance for the smartphone was 4.6 mW/cm2, which is 150 times below the thermal limit (706 mW/cm2). For photochemical hazard, the weighted retinal radiant exposure was 41 mJ/cm2 (exposure duration of 1 minute), which is 240 times below the photochemical limit (10 J/cm2). Since the light safety standards not only account for the total retinal irradiance but also for spectral distribution, we measured the latter with a spectroradiometer (USB4000, Ocean Optics, Dunedin, FL). The radiation was limited to the 400-700 nm wavelength interval with about 70% of that light emitted in the blue and green part of the spectrum (wavelength < 600 nm).

We then compared the light levels produced during smartphone fundoscopy with those produced by standard indirect ophthalmoscopes. The retinal irradiance produced by a Keeler Vantage Plus LED (Keeler Instruments Inc., Broomall, PA), measured using identical procedures as earlier described, was 46 mW/cm2 or about 10 times the levels observed with the smartphone. This finding corresponds well with retinal irradiances of 8 to 210 mW/cm2 found in other studies for a wide selection of indirect ophthalmoscopes. The spectral distribution of the Keeler indirect ophthalmoscope was similar to that of the smartphone (both have LED sources). The weighted exposures for the Keeler indirect ophthalmoscope were thus 15 and 24 times less than the limits for thermal and photochemical hazards, respectively. The lower light level available for observation using the smartphone, as opposed to the indirect ophthalmoscope, is largely compensated for by the high electronic sensitivity of the camera.

In conclusion, retinal exposure from the smartphone was 1 order of magnitude less than that from the indirect ophthalmoscope and both are within safety limits of thermal and photochemical hazards as defined by the ISO when tested under conditions simulating routine fundoscopy.
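The safety factors quoted above are simply the ISO limit divided by the measured weighted exposure; redoing the arithmetic with the study's numbers:

```python
# Margin arithmetic behind the "150 times" and "240 times" figures for
# the smartphone (measured values from Kim, Delori & Mukai 2012).

def safety_margin(limit, measured):
    """How many times below the hazard limit the measured exposure sits."""
    return limit / measured

thermal = safety_margin(706.0, 4.6)       # mW/cm2 vs mW/cm2 -> ~153x
photochem = safety_margin(10.0, 0.041)    # J/cm2 vs J/cm2   -> ~244x
print(round(thermal), round(photochem))   # 153 244
```

The exact ratios (153x, 244x) round to the "150 times" and "240 times" quoted in the paper.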
Example of Calculation

For calculating retinal irradiance from quasi-monochromatic green (565 nm) pars planar illumination

2018
Contact-free trans-pars-planar illumination enables snapshot fundus camera for nonmydriatic wide field photography
https://doi.org/10.1038/s41598-018-27112-x

The thickness of the sclera is ~0.5 mm (Olsen et al. 1998). The transmission of the sclera in visible wavelength is 10-30% (Vogel et al. 1991). To be conservative, 30% was used for calculation. For the proof-of-concept experiment, the weighted irradiance on the sclera was calculated to be 0.5 mW, the area of the arc-shaped light was 13 mm2. For the worst case estimation, we assumed all illumination light directly expose to the retinal area behind the illuminated sclera area. Therefore, the maximum allowed exposure time is

If the illumination light accidently fell into the pupil, the illuminated area on retina was estimated to be >9 mm2. Thus the maximum allowed exposure time through the pupil is >30 minutes. For thermal hazard, the maximum weighted power intensity allowed on the sclera without thermal hazard concern is 700 mW/cm2. The calculated weighted power intensity was 230 mW/cm2, which was more than three times lower than the maximum limit.

Kölbl et al. (2015): As light source a round shaped white LED is used. By integration of the light source into a speculum, the LED is pressed firmly against the sclera. Thus the ocular space is illuminated transsclerally. As a result an indirect uniform illumination of the complete intraocular space is achieved.
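The ">30 minutes" figure can be reproduced from the photochemical limit and the worst-case geometry (all 0.5 mW entering the pupil onto >9 mm² of retina); a sketch with my own unit handling:

```python
# Maximum photochemical exposure time = limit / irradiance, with the
# worst-case assumption that the full LED power lands on the quoted
# retinal area. Numbers taken from the calculation above.

def max_exposure_time_s(limit_J_per_cm2, power_mW, area_mm2):
    irradiance_W_per_cm2 = (power_mW * 1e-3) / (area_mm2 * 1e-2)  # mm2 -> cm2
    return limit_J_per_cm2 / irradiance_W_per_cm2

t = max_exposure_time_s(10.0, 0.5, 9.0)   # worst case: all 0.5 mW via pupil
print(t / 60.0)                            # ~30 minutes
```

Since 9 mm² is a lower bound on the illuminated area, the actual allowed time is longer, hence the ">30 minutes" in the paper.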
Example of the use of Supercontinuum light source

2018
Use of a supercontinuum white light in evaluating the spectral sensitivity of the pupil light reflex
Catherine Chin; Lasse Leick; Adrian Podoleanu; Gurprit S. Lal
Univ. of Kent (United Kingdom); NKT Photonics A/S (Denmark)
https://doi.org/10.1117/12.2286064

We assessed the spectral sensitivity of the pupillary light reflex in mice using a high power supercontinuum white light (SCWL) source in a dual wavelength configuration. This novel approach was compared to data collected from a more traditional setup using a Xenon arc lamp fitted with monochromatic interference filters.

Light was generated by the NKT Photonics SuperK Extreme EXR and filtered through Extended UV (480 nm) and SuperK Select (560 nm) modules.

The use of a SCWL is a significant leap forward from the Xenon arc light traditionally used in recording pupillary light responses. The SCWL gives the experimenter much more control over the light stimulus, through wavelength, intensity and, most importantly, a dual light configuration.

Together, this will allow more complex lighting protocols to be developed that can further assist in unraveling the complex coding of light that gives rise to the pupil light reflex and other photic driven physiological responses.
Image
Processing
Intro for compensating
low image quality
computationally
Fundus Video Processing

2014
Multi-frame Super-resolution with Quality Self-assessment for Retinal Fundus Videos
https://doi.org/10.1117/12.2036970
Thomas Köhler, Alexander Brost, Katja Mogalle, Qianyi Zhang, Christiane Köhler, Georg Michelson, Joachim Hornegger, Ralf P. Tornow

In order to compensate heterogeneous illumination on the fundus, we integrate retrospective illumination correction for photometric registration to the underlying imaging model. Our method utilizes quality self-assessment to provide objective quality scores for reconstructed images as well as to select regularization parameters automatically. In our evaluation on real data acquired from six human subjects with a low-cost video camera, the proposed method achieved considerable enhancements of low-resolution frames and improved noise and sharpness characteristics by 74%.

2014
Geometry-Based Optic Disk Tracking in Retinal Fundus Videos
https://doi.org/10.1007/978-3-642-54111-7_26
Anja Kürten, Thomas Köhler, Attila Budai, Ralf-Peter Tornow, Georg Michelson, Joachim Hornegger

Fundus video cameras enable the acquisition of image sequences to analyze fast temporal changes on the human retina in a non-invasive manner. In this work, we propose a tracking-by-detection scheme for the optic disk to capture the human eye motion on-line during examination. Our approach exploits the elliptical shape of the optic disk.

2014
Blood vessel segmentation in video-sequences from the human retina
https://doi.org/10.1109/IST.2014.6958459
J. Odstrcilik; R. Kolar; J. Jan; R. P. Tornow; A. Budai

This paper deals with the retinal blood vessel segmentation in fundus video-sequences acquired by the experimental fundus video camera. Quality of acquired video-sequences is relatively low and fluctuates across particular frames. Especially, due to the low resolution, poor signal-to-noise ratio, and varying illumination conditions within the frames, application of standard image processing methods might be difficult in such experimental fundus images.

2016
Registration of retinal sequences from new video-ophthalmoscopic camera
https://doi.org/10.1186/s12938-016-0191-0
Radim Kolar, Ralf P. Tornow, Jan Odstrcilik and Ivana Liberdova

Analysis of fast temporal changes on retinas has become an important part of diagnostic video-ophthalmology. It enables investigation of the hemodynamic processes in retinal tissue, e.g. blood-vessel diameter changes as a result of blood-pressure variation, spontaneous venous pulsation influenced by intracranial-intraocular pressure difference, blood-volume changes as a result of changes in light reflection from retinal tissue, and blood flow using laser speckle contrast imaging. For such applications, image registration of the recorded sequence must be performed.
Multiframe registration

July 2018
Fundus photography with subpixel registration of correlated laser speckle images
https://doi.org/10.7567/JJAP.57.09SB01
Jie-En Li and Chung-Hao Tien

Schematic diagram of the


optics of fundus
photography. LS, light source;
CL, collector lens; AD,
aperture diaphragm; BS,
beam splitter; FD, field
diaphragm; Ln, lenses. In our
experiment, the focal lengths
of the lenses are 30 and 100
mm (f1 = 300 mm and f2 =
100 mm).

Images obtained by laser speckle contrast imaging (LSCI) (a) without image registration and (b) with image registration process. (c) Speckle contrast of (a) and (b) along the red lines.

Images of rabbit retina with (a) incoherent illumination, (b) coherent illumination, and (c) LSCI image. Vessels enclosed by the red frame are barely distinguishable in images obtained using a conventional fundus system, but are enhanced when their images are obtained with the help of LSCI.
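The "speckle contrast" plotted in these figures is the local ratio K = sigma/mu; a plain-NumPy sketch over non-overlapping windows (the window size is an illustrative choice):

```python
import numpy as np

def speckle_contrast(img, win=7):
    """Local speckle contrast K = std/mean per win x win tile."""
    h, w = img.shape
    out = np.zeros((h // win, w // win))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = img[i*win:(i+1)*win, j*win:(j+1)*win]
            mu = patch.mean()
            out[i, j] = patch.std() / mu if mu > 0 else 0.0
    return out

flat = np.full((70, 70), 100.0)                       # no speckle -> K = 0
speckled = np.random.default_rng(0).exponential(100.0, (70, 70))
print(speckle_contrast(flat).max())                   # 0.0
print(speckle_contrast(speckled).mean() > 0.5)        # developed speckle ~ 1
```

Flowing blood blurs the speckle within the exposure, lowering K over vessels, which is why vessels stand out in the LSCI image.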
Future of
Fundus
Imaging
Hardware becoming a
low-cost commodity
with the value of easy
data acquisition
increasing
Google building a …

Lily Peng – Google Brain: talking about deep learning for fundus images (diabetic retinopathy), and their collaboration with the Aravind institute in India, at APTOS 2018

2018
Optical Systems Engineer
Verily Life Sciences
https://www.linkedin.com/jobs/view/674965920

Responsibilities
● Design state-of-the-art optics-based devices.
● Work closely with interdisciplinary team to integrate optical designs into prototypes and product development path.

Minimum Qualifications
● PhD in Optical Engineering/Optics/Physics/EE, or related technical field or equivalent practical experience.
● Knowledge and experience in optical design, optics and imaging systems design.
● Applied research experience in physics/optics/imaging systems.

Preferred Qualifications
● Experience in opto-mechanical design
● Experience in electronics (PCB schematic capture and layout, soldering, etc.)
● Programming experience in MATLAB/C/C++/Python
● Experience in optics and electronics with a focus on optical/spectral imaging/sensing technologies.

2018
Electrical Engineer, Ophthalmic Devices
Verily Life Sciences
https://www.linkedin.com/jobs/view/692175458

Responsibilities
● Working with cross-functional teams to define electronic systems based on system-level requirements and tradeoffs
● Design electronic systems for highly miniaturized electronic devices, especially at the PC board level
● Identification, selection and qualification of key electronic components for electronic systems in miniaturized medical devices
● Identification and qualification of key vendors and partners for electronics integration and manufacture
● Rapid prototyping of electronic systems with in-house teams and with support from vendors, including integration with a variety of electrical and mechanical components

Minimum Qualifications
● MS degree in Electrical Engineering or related major plus 4 years work experience
● Solid analog and digital circuit design skill

Preferred Qualifications
● Experience with microcontrollers
● Experience with complex PC board design
● Experience with firmware development
● Experience with (real time) data processing

2018
Technical Lead, Ophthalmic Devices and Electroactive Polymers
Verily Life Sciences
https://www.linkedin.com/jobs/view/technical-lead-ophthalmic-devices-and-electroactive-polymers-at-verily-life-sciences-731643271/

Responsibilities
● Leading the development of new ophthalmological devices and diagnostic instrumentation.
● Early-stage hardware and systems development.

Minimum Qualifications
● Experience in electroactive polymer applications in optical systems.
● Experience with embedded systems hardware/software development.
● Experience with different stages of product development: proof-of-concept, prototyping, EV and DV builds.

Preferred Qualifications
● Experience with opto-mechanical systems and components. Experience with ophthalmic devices.
● Background in both hardware and software. Programming experience with C++ and Python.
● Excellent communication and collaboration skills.
● Product development track record in optical applications of electroactive polymers or similar.
● Experience with Camera, Image Signal Processor (ISP) or camera software stack. Knowledge of signal processing, digital imaging, computer vision, and image/video processing. Experience with SoC, camera modules, VR/AR display techniques.
Portable adaptive optics for improving image quality

05 Sep 2018
Adaptive optics reveals fine details of retinal structure
Duke University handheld AOSLO platform assists diagnosis of eye disease and trauma
http://optics.org/news/9/9/5

2018
Handheld adaptive optics scanning laser ophthalmoscope
https://doi.org/10.1364/OPTICA.5.001027
http://people.duke.edu/~sf59/HAOSLO.htm
Theodore DuBose, Derek Nankivil, Francesco LaRocca, Gar Waterman, Kristen Hagan, James Polans, Brenton Keller, Du Tran-Viet, Lejla Vajzovic, Anthony N. Kuo, Cynthia A. Toth, Joseph A. Izatt, and Sina Farsiu

"To overcome the weight and size restrictions in integrating AOSLO into handheld form (weighing less than 200 g), we used recent advancements in the miniaturization of deformable mirror technology and 2D microelectromechanical systems (MEMS) scanning, together with a novel wavefront sensorless AO algorithm and a custom optical and mechanical design," commented the team in the Optica paper.

The new probe and associated numerical methods


could be useful for a variety of applications in
ophthalmic imaging, and the Duke team has made
its designs and computational algorithms available
as open source data for other researchers.

The system was able to image photoreceptors as


close as 1.4 degrees eccentric to the fovea area
of the retina, where photoreceptors have an
average spacing of 4.5 microns. Without AO, the
closest measurement had previously been 3.9
degrees. Further clinical trials with the instrument
will follow, and the researchers plan to incorporate
additional imaging modalities into the
platform that could prove useful for detecting
other diseases.
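For context, the quoted 4.5-micron spacing can be expressed in visual angle using a common emmetropic approximation of ~291 µm of retina per degree (my assumption, not a figure from the article):

```python
# Convert photoreceptor spacing on the retina to visual angle, using an
# approximate emmetropic retinal magnification of 291 um per degree.

UM_PER_DEGREE = 291.0   # assumed conversion factor, schematic-eye scale

def spacing_arcmin(spacing_um):
    """Retinal distance (um) expressed in arcminutes of visual angle."""
    return spacing_um / UM_PER_DEGREE * 60.0

print(round(spacing_arcmin(4.5), 2))   # ~0.93 arcmin
```

Resolving roughly one-arcminute photoreceptor mosaics is exactly the regime where diffraction and ocular aberrations force the use of adaptive optics.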
Novel
Components
For Imaging
The use of MEMS tech has already allowed miniaturization of many devices probing visual function

What more?
Diffractive Optics for Fundus Imaging?

February 2018
Broadband imaging with one planar diffractive lens
https://doi.org/10.1080/24699322.2017.1379143
Nabil Mohammad, Monjurul Meem, Bing Shen, Peng Wang & Rajesh Menon | Department of Electrical and Computer Engineering, MACOM Technology Solutions, Department of Medical Engineering, California Institute of Technology

Here, we design, fabricate and characterize broadband diffractive optics as planar lenses
broadband diffractive optics as planar lenses
for imaging. Broadband operation is achieved by
optimizing the phase transmission function for each
wavelength carefully to achieve the desired intensity
distribution at that wavelength in the focal plane. Our
approach is able to maintain the quality of the
images comparable to that achievable with
more complex systems of lenses.

The achromatic lenses were patterned on a photoresist film atop a glass wafer using grayscale laser patterning using a Heidelberg Instruments MicroPG101 tool. The exposure dose was varied as a function of position in order to achieve the multiple height levels dictated by the design.

We show here that planar diffractive lenses, when designed properly and fabricated carefully, are sufficient for broadband imaging. By extending the fabrication process to industry-relevant lithographic scales and large area replication via nanoimprinting (Guo 2004), it is possible to envision planar lenses enabling imaging with very thin form factors, low weights and low costs. Therefore, we believe that our approach will lead to considerably simpler, thinner and cheaper imaging systems.

(a) Schematic of a flat-lens design. The structure is comprised of concentric rings of width Wmin and varying heights. (b) Photograph of one fabricated lens. Optical micrographs of (c) NA = 0.05 and (d) NA = 0.18 lenses. Focal length is 1 mm. Measured full-width at half-maximum (FWHM) of the focal spot as a function of wavelength for (e) NA = 0.05 and (f) NA = 0.18 lenses. Measured focal spots as a function of wavelength for (g) NA = 0.05 and (h) NA = 0.18 lenses.
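The need for per-wavelength optimization comes from the dispersion of the phase imparted by a relief height h: phi = 2*pi*(n-1)*h/lambda. A small sketch (the photoresist index n = 1.5 is an assumption):

```python
# A surface-relief step that gives exactly 2*pi of phase at the design
# wavelength gives more phase in the blue and less in the red, which is
# the chromatic problem broadband diffractive-lens optimization corrects.
import math

def phase_rad(height_nm, wavelength_nm, n=1.5):
    """Phase delay of a relief step of the given height."""
    return 2.0 * math.pi * (n - 1.0) * height_nm / wavelength_nm

h = 1100.0  # nm: chosen to give exactly 2*pi phase at 550 nm for n = 1.5
for lam in (450.0, 550.0, 650.0):
    print(lam, round(phase_rad(h, lam) / (2.0 * math.pi), 2))
```

Optimizing the multilevel height map jointly over wavelengths, as the paper describes, trades a little monochromatic efficiency for a focal spot that stays tight across the band.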
Diffractive Optics in intraocular lenses (IOL) and satellite-based remote sensing

July 2018
Fractal-structured multifocal intraocular lens
https://doi.org/10.1371/journal.pone.0200197
Laura Remón, Salvador García-Delpech, Patricia Udaondo, Vicente Ferrando, Juan A. Monsoriu, Walter D. Furlan | Departamento de Óptica y Optometría y Ciencias de la Visión, Universitat de València, Burjassot, Spain

May 2018
Modification of Fresnel zone light field spectral imaging system for higher resolution
https://doi.org/10.1117/12.2303898
Carlos Diaz; Anthony L. Franz; Jack A. Shepherd
Air Force Institute of Technology (United States)
based remote sensing Recent interest in building an imaging system using


diffractive optics that can fit on a CubeSat (10 cm x 10
cm x 30 cm) and can correct severe chromatic
aberrations inherent to diffractive optics has led to the
development of the Fresnel zone light field
spectral imaging system (FZLFSI). The FZLFSI is a
system that integrates an axial dispersion binary
diffractive optic with a light field (plenoptic)
camera design that enables snapshot spectral
imaging capability.

In this work, we present a new concept of IOL design inspired by the


demonstrated properties of reduced chromatic aberration and
extended depth of focus of Fractal zone plates. A detailed
description of a proof of concept IOL is provided. The result was
numerically characterized, and fabricated by lathe turning. The
prototype was tested in vitro using dedicated optical system and
software. The theoretical Point Spread Function along the optical axis,
computed for several wavelengths, showed that for each wavelength,
the IOL produces two main foci surrounded by numerous secondary
foci that partially overlap each other for different wavelengths. The result
is that both, the near focus and the far focus, have an extended
depth of focus under polychromatic illumination. 
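For reference, the conventional Fresnel zone radii that a fractal zone plate redistributes follow r_m = sqrt(m * lambda * f); a quick sketch with illustrative IOL-like numbers (not the paper's design values):

```python
# Radii of a conventional Fresnel zone plate. A fractal zone plate keeps
# this square-root law but selects zones along a Cantor-like sequence,
# which produces the multiple overlapping foci described above.
import math

def zone_radii_mm(wavelength_mm, focal_mm, n_zones):
    """r_m = sqrt(m * lambda * f) for m = 1..n_zones."""
    return [math.sqrt(m * wavelength_mm * focal_mm)
            for m in range(1, n_zones + 1)]

radii = zone_radii_mm(550e-6, 20.0, 8)   # 550 nm, f = 20 mm (illustrative)
print([round(r, 3) for r in radii])
```

Because the zone spacing shrinks outward as sqrt(m), perturbing which zones are open reshapes the axial point spread function rather than the transverse one, which is how the fractal design extends depth of focus.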
Metamaterial Optics for Fundus Imaging?

July 2017
Metamaterials and imaging
https://dx.doi.org/10.1186/s40580-015-0053-7
Minkyung Kim and Junsuk Rho

2018
Dynamically tunable and active hyperbolic metamaterials
https://doi.org/10.1364/AOP.10.000354
Joseph S. T. Smalley, Felipe Vallini, Xiang Zhang, and Yeshaiahu Fainman

Here, we review metamaterial-based lenses which offer the new types of imaging components and functions. Perfect lens, superlenses, hyperlenses, metalenses, flat lenses based on metasurfaces, and non-optical lenses including acoustic hyperlens are described.

Not all of them offer sub-diffraction imaging, but they


provide new imaging mechanisms by controlling
and manipulating the path of light. The underlying
physics, design principles, recent advances, major
limitations and challenges for the practical
applications are discussed in this review.

Diffraction-free acoustic imaging using metamaterials


allows more efficient underwater sonar sensing,
medical ultra-sound imaging, and non-
destructive materials testing.

Therefore, with the development of


nanofabrication and nanomanufacturing
methods, and the integration of new creative ideas,
overcoming the results and limitations mentioned in
this review will be the continuous efforts to make
metamaterial-based imaging techniques to be a next
generation of imaging technology replacing current
optical microscopy, which thus can be called
nanoscopy.
Metasurface Optics for Fundus Imaging?

July 2017
Optics with Metasurfaces: Beyond Refractive and Diffractive Optics
https://doi.org/10.1364/OFT.2017.OW1B.5
Mohammadreza Khorasaninejad
Harvard University

Flat optics based on metasurfaces has the potential to replace/complement conventional refractive/diffractive components. In this talk, we give an overview of our works on dielectric metasurfaces, which have led to high performance components in the visible spectrum.
Nano-optic endoscope sees deep into tissue at high resolution: experts in endoscopic imaging at Massachusetts General Hospital (MGH) and pioneers of flat metalens technology at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) have teamed up to develop a new class of endoscopic imaging catheters, termed nano-optic endoscopes
https://doi.org/10.1038/s41566-018-0224-2
Department of Electrical and Computer Engineering, National University of Singapore,
Singapore, Singapore - Yao-Wei Huang & Cheng-Wei Qiu

https://doi.org/10.1364/OPTICA.4.000139
