WO2015047981A1 - Systems and methods for diagnosing inherited retinal diseases


Info

Publication number
WO2015047981A1
Authority
WO
WIPO (PCT)
Application number
PCT/US2014/056891
Other languages
French (fr)
Inventor
Kanishka T. JAYASUNDERA
Gail Hohner
Jillian T. HUANG
Naheed W. Khan
Matthew K. Johnson-Roberson
Daniel L. Albertus
Ira Schachar
Sarwar Zahid
Amani Al-Tarouti
Christopher R. Ranella
Zhao HUANG
Andrew M. Lynch
Carla S. Kaspar
Nathan T. Patel
Adnan Tahir
Original Assignee
The Regents Of The University Of Michigan
Application filed by The Regents Of The University Of Michigan
Publication of WO2015047981A1


Classifications

    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 - Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24 - Querying
    • G06F16/245 - Query processing
    • G06F16/2457 - Query processing with adaptation to user needs
    • G06F16/24578 - Query processing with adaptation to user needs using ranking
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 - Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • G06F16/5838 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using colour

Definitions

  • the present disclosure generally relates to a method for diagnosing retinal diseases and, more particularly, to a method for automated diagnosis of inherited retinal disease.
  • the human retina is a thin layer of neural tissue at the back of the eye that transforms light into electrical signals for the brain.
  • the retina can be divided into distinct regions related to their visual function. These distinct regions are: (i) the posterior pole, where the majority of photoreceptor cells (responsible for central, high acuity color vision) lie; and (ii) the periphery, which includes everything outside the posterior pole.
  • the posterior pole includes a non-photosensitive structure known as the optic nerve and the macula. Within the center of the macula is a region known as the fovea, which is responsible for nearly all of our high acuity color vision.
  • the process of diagnosing an inherited retinal disease typically involves recording demographic information about the patient, documenting their ocular and medical history including the age of onset and reported symptoms, evaluating the family history to determine if there is specific genetic mode of transmission that can be distinguished, and performing a clinical evaluation.
  • the clinical evaluation often includes diagnostic testing with a best corrected visual acuity (BCVA) measurement, electroretinography (ERG), Goldmann visual field (GVF) testing, and interpretation of retinal imaging, e.g., optical coherence tomography, fundus autofluorescence, and fluorescein angiography.
  • following genetic sequencing, determining causative mutations from polymorphisms can be guided by an assessment of the patient's phenotype.
  • a computer-implemented method for automatically diagnosing inherited retinal disease comprises receiving, via a network interface, a plurality of dissimilar types of data from an end user device, the plurality of dissimilar types of data being related to retinal disease and the plurality of dissimilar types of data corresponding to a patient, and pre-processing, with one or more processors, at least one of the plurality of dissimilar types of data to generate a feature vector descriptive of the patient.
  • the method includes, for each of the plurality of dissimilar types of data: (i) comparing, with the one or more processors, portions of the respective type of data or a corresponding feature vector to data in a mutation proven database, wherein the data in the mutation proven database is similar to the respective type of data and wherein the data in the mutation proven database corresponds to a plurality of patients with known genetic diagnoses; (ii) generating, with the one or more processors, a ranked list of matches between the patient and the plurality of patients with known genetic diagnoses; and (iii) storing, with the one or more processors, the ranked list of matches in an output database.
  • the method includes aggregating, with the one or more processors, a plurality of ranked lists of matches in the output database to generate a ranked list of genetic diagnoses corresponding to the patient, and sending, via the network interface, an indication of the ranked list of genetic diagnoses to the end user device.
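  • the claim above leaves the aggregation formula open; one plausible way to merge per-data-type ranked lists into a single ranked list of genetic diagnoses is reciprocal-rank fusion, sketched below (the gene names and the fusion constant are hypothetical illustrations, not values from the disclosure):

```python
from collections import defaultdict

def fuse_ranked_lists(ranked_lists, k=60):
    """Combine several ranked lists of candidate genetic diagnoses into one
    ranking via reciprocal-rank fusion: each list contributes 1/(k + rank)
    to every diagnosis it ranks. This is one plausible aggregation
    strategy; the patent does not mandate a specific formula."""
    scores = defaultdict(float)
    for ranking in ranked_lists:
        for rank, diagnosis in enumerate(ranking, start=1):
            scores[diagnosis] += 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical per-data-type rankings (e.g., from ERG, visual field, and
# AF imagery comparisons against the mutation proven database).
erg_ranking = ["ABCA4", "USH2A", "RPGR"]
vf_ranking = ["ABCA4", "RPGR", "USH2A"]
af_ranking = ["USH2A", "ABCA4", "RPGR"]

final = fuse_ranked_lists([erg_ranking, vf_ranking, af_ranking])
```

    A diagnosis ranked highly by several independent data types accumulates the largest score and tops the final list sent to the end user device.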
  • a computer device for automatically diagnosing inherited retinal disease comprises one or more processors and one or more memories coupled to the one or more processors, wherein the one or more memories include computer executable instructions stored therein that, when executed by the one or more processors, cause the one or more processors to: receive, via a network interface, a plurality of dissimilar types of data from an end user device, the plurality of dissimilar types of data being related to retinal disease and the plurality of dissimilar types of data corresponding to a patient, and pre-process at least one of the plurality of dissimilar types of data to generate a feature vector descriptive of the patient.
  • the computer executable instructions cause the one or more processors to, for each of the plurality of dissimilar types of data: (i) compare portions of the respective type of data or a corresponding feature vector to data in a mutation proven database, wherein the data in the mutation proven database is similar to the respective type of data and wherein the data in the mutation proven database corresponds to a plurality of patients with known genetic diagnoses; (ii) generate a ranked list of matches between the patient and the plurality of patients with known genetic diagnoses; and (iii) store the ranked list of matches in an output database.
  • the computer executable instructions cause the one or more processors to aggregate a plurality of ranked lists of matches in the output database to generate a ranked list of genetic diagnoses corresponding to the patient and send an indication of the ranked list of genetic diagnoses to the end user device.
  • FIG. 1 illustrates an example computing environment for automated diagnosis of inherited retinal disease.
  • FIG. 2 is a block diagram of an example implementation of the diagnosis server illustrated in Fig. 1.
  • FIG. 3 is a flow diagram of an example method for automated diagnosis of inherited retinal disease which can be implemented by the diagnosis server illustrated in Fig. 2.
  • Fig. 4 illustrates a screenshot of an example landing page 400 which can be utilized as part of the method of Fig. 3.
  • Fig. 5 illustrates a screenshot of an example detailed patient chart entry form which can be utilized as part of the method of Fig. 3.
  • FIGs. 6A-6D illustrate screenshots of example entry or upload forms which can be accessed via the detailed patient chart entry form illustrated in Fig. 5.
  • Fig. 7 illustrates a screenshot of an example completed patient chart entry form which can be a completed version of the detailed patient chart entry form illustrated in Fig. 5.
  • Fig. 8 illustrates a screenshot of an example labeling interface which can be utilized as part of the method of Fig. 3.
  • Fig. 9 is an example array of hyperfluorescent rings in autofluorescence imaging.
  • Fig. 10 is an example image with hyperfluorescent flecks in autofluorescence imaging.
  • Fig. 11 is an example image with hypofluorescent flecks in autofluorescence imaging.
  • Fig. 12 illustrates a screenshot of an example results page which can be utilized as part of the method of Fig. 3.
  • Fig. 1 illustrates an example computing environment 100 to automatically generate diagnoses of inherited retinal disease.
  • a user of an end user device 102 is communicatively coupled, via one or more wired or wireless interfaces, to a network 104 and a web server 106.
  • the end user device may include any suitable computing device such as a personal computer, smartphone, tablet computer, etc. operated by a clinician, for example.
  • the network 104 may be a proprietary network, a secure public internet, a virtual private network or some other type of network, such as dedicated access lines, plain ordinary telephone lines, satellite links, combinations of these, etc. Where the network 104 comprises the Internet, data communications may take place over the network 104 via an Internet communication protocol.
  • the web server 106 may be implemented in one of several known configurations via one or more servers configured to process web-based traffic received via the network 104, and may include load balancing, edge caching, proxy services, authentication services, etc.
  • the end user device 102 is capable of executing a graphical user interface (GUI) for a retinal disease diagnosis tool within a web browser application, such as Apple's Safari®, Google AndroidTM mobile web browser, Microsoft Internet Explorer®, etc.
  • the web browser application may be implemented as a series of machine-readable instructions for receiving, interpreting, and displaying web page information from the web server 106 while also receiving inputs (e.g., genetics, clinical, or imagery data) from the user.
  • a diagnosis server 108 may include a number of software applications responsible for generating retinal disease diagnosis tool content to be included in the web pages sent from the web server 106 to the end user device 102.
  • the diagnosis server 108 may generate data input forms (e.g., for input of clinical data and imagery data about a patient), diagnosis results tables, disease trend indications, etc. as discussed below, to be included in the web pages sent to the end user device 102.
  • the details of an implementation of the diagnosis server 108 are discussed in more detail with reference to Fig. 2.
  • although Fig. 1 illustrates one diagnosis server 108, the techniques of the present disclosure may be implemented by any number of servers with any number of processors, such as in a "cloud computing" implementation.
  • the example diagnosis server 108 is operatively connected to a mutation proven database 112 and an output database 114.
  • additional databases may be operatively connected to the diagnosis server 108 and/or the databases 112 and 114 may be combined or split into any number of databases or data structures.
  • the mutation proven database 112 may be, for example, configured to store data about a plurality of patients with known genetic diagnoses.
  • the data about the plurality of patients may include: (i) a plurality of demographic data 120 such as age, sex, etc.; (ii) a plurality of clinical data 122 such as ERG values, visual field information, visual acuity information, etc.; and (iii) a plurality of imagery data 124 such as autofluorescence (AF) images, color fundus images, etc.
  • the mutation proven database 112 includes a list of patient identifications (IDs) and corresponding entries (e.g., demographic, clinical, and imagery entries) for each patient ID. It is understood, however, that the mutation proven database 112 may include any suitable type of data related to known genetic diagnoses, and the mutation proven database 112 may be organized according to any appropriate data structure.
  • the output database 114 may be, for example, configured to store results or output generated as part of an automated diagnosis of inherited retinal disease.
  • the results stored in the output database 114 may include: (i) genetics output 130 (or "mode of transmission" output) such as generated probabilities corresponding to various genetic forms of transmission and based on family history data about a patient; (ii) clinical output 132 such as clinical feature vectors representative of clinical data (e.g., ERG values) about the patient and/or ranked lists of genetic diagnoses based on clinical data about the patient; and (iii) imagery output 134 such as imagery feature vectors representative of imagery data (e.g., AF images) and/or ranked lists of genetic diagnoses based on imagery data about the patient.
  • the output database 114 may store a final ranked list of genetic diagnoses based on all the data about a patient (e.g., including both clinical and imagery data about the patient). The generation, storing, and utilization of genetics, clinical, and imagery output is further discussed with reference to Fig. 3.
  • Fig. 2 illustrates an example diagnosis server 150 that may automatically generate diagnoses of inherited retinal disease and generate content for display on an end user device.
  • the diagnosis server 150 may be implemented as the diagnosis server 108 in the example computing system 100, for example.
  • the diagnosis server 150 may include one or more processing units 151, a system memory 152a and 152b, and a system bus 154 that couples various system components including the system memory 152 to the processing units 151.
  • the system bus 154 may include an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, a Peripheral Component Interconnect (PCI) bus, or a Peripheral Component Interconnect Express (PCI-E) bus.
  • the diagnosis server 150 may include an assortment of computer-readable media.
  • Computer-readable media may be any media that may be accessed by the diagnosis server 150.
  • the media may include both volatile and nonvolatile media, removable and non-removable media.
  • Media may also include computer storage media and communication media.
  • Computer storage media may include volatile and nonvolatile, removable and non-removable media that stores information such as computer-readable instructions, program modules, data structures, or other data.
  • Computer-storage media may include RAM, ROM, EEPROM, or other memory technology, optical storage disks, magnetic storage devices, and any other medium which may be used to store computer-accessible information.
  • Communication media may be computer-readable instructions, data structures, program modules, or other data in a modulated data signal or other transport mechanism.
  • Communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as RF, infrared, and other wireless media.
  • the system memory may include storage media in the form of volatile and/or non-volatile memory such as ROM 152b and RAM 152a.
  • a basic input/output system (BIOS), containing algorithms to transfer information between components within the computer 150, may be stored in ROM 152b.
  • Data or program modules that are immediately accessible or are presently in use by the processing units 151 may be stored in RAM 152a.
  • Data normally stored in RAM 152a while the diagnosis server 150 is in operation may include an operating system, application programs, program modules, and program data.
  • the RAM 152a may store a retinal disease diagnosis application 160 including an image routine 162, a genetics routine 164, a clinical routine 166, and a diagnosis routine 168, for example.
  • the diagnosis server 150 may also include other storage media such as a hard disk drive that may read from or write to non-removable, non-volatile magnetic media, a magnetic disk drive that reads from or writes to a removable, non-volatile magnetic disk, and an optical disk drive that reads from or writes to a removable, nonvolatile optical disk.
  • Other storage media that may be used includes magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, and solid state ROM.
  • the hard disk drive may be connected to the system bus 154 through a non-removable memory interface such as interface 174.
  • a magnetic disk drive and optical disk drive may be connected to the system bus 154 by a removable memory interface, such as interface 190.
  • a user may interact with the diagnosis server 150 through input devices such as a keyboard or a pointing device (e.g., a mouse).
  • a user input interface 202 may be coupled to the system bus 154 to allow the input devices to communicate with the processing units 151.
  • a display device 222 such as a monitor, may also be connected to the system bus 154 via a video interface (not shown).
  • the diagnosis server 150 may operate in a networked environment using logical connections to one or more remote computing devices, such as end user device 102 or web server 106, for example.
  • the remote computing device may be a personal computer (PC), a server, a router, or other common network node.
  • the remote computing device typically includes many or all of the previously-described elements regarding the diagnosis server 150, even though such elements are not illustrated in the remote computing devices of Fig. 1.
  • Logical connections between the diagnosis server 150 and one or more remote computing devices may include a wide area network (WAN).
  • a typical WAN is the Internet.
  • the diagnosis server 150 may include a modem or other means for establishing communications over the WAN.
  • the modem may be connected to the system bus 154 via the network interface 225, or other mechanism.
  • program modules depicted relative to the diagnosis server 150 may be stored in the remote memory storage device.
  • other means of establishing a communications link between the computer 150 and a remote computing device may be used.
  • Fig. 3 is a flow diagram of an example method 300 for automatically diagnosing inherited retinal disease.
  • the method 300 may be implemented by the diagnosis server 108, for example.
  • example screenshots, illustrated in Figs. 4, 5, 6A-6D, 7, 8, and 12, are referred to below.
  • a diagnosis server may utilize any suitable web browser or application based content and interaction to facilitate the automatic diagnosis of inherited retinal disease.
  • elements of Fig. 1 are referred to in the description of the method 300, but, of course, the method 300 may be implemented in any suitable computing environment.
  • patient data is received from an end user device (block 302).
  • a clinician may operate the end user device 102 to input patient data via a web browser application and subsequently send the patient data to the diagnosis server 108.
  • the patient data may include multiple dissimilar types of data related to inherited retinal disease.
  • the patient data may include: (i) family history data, such as number and gender of children, number of brothers and sisters, number of paternal and maternal aunts and uncles, parents, and grandparents, who, if any, is affected by disease, the age of onset of disease in each individual, and whether or not there is consanguinity between the patient's parents; (ii) demographic data, such as age, sex, ethnicity, etc.
  • a clinician may input, and a diagnosis server may receive, any number and/or combination of types of data related to inherited retinal disease.
  • Fig. 4 illustrates a screenshot of an example landing page 400, accessed via a web browser application, in which a clinician may begin to input patient data and subsequently send the patient data to a diagnosis server for evaluation.
  • the landing page 400 may include multiple form fields in which a clinician may input information about a patient.
  • the clinician may input a "Nickname" (e.g., "New Patient") for the patient into a nickname form field 402 and sex and age information into one or more demographics form fields 404.
  • the clinician may send entered patient data to the diagnosis server 108 by selecting (e.g., via a click or tap) the add patient button 406.
  • selection of the add patient button 406 will also register the patient (e.g., under the nickname "New Patient") in a list of patients (not shown), such that corresponding patient data may be later accessed by the clinician.
  • the clinician may be presented with a detailed patient chart entry form 500, as illustrated in Fig. 5.
  • the clinician may use the detailed patient chart entry form 500 to enter or upload a variety of dissimilar types of patient data (BCVA, ERG, etc.) and subsequently send the entered or uploaded patient data to the diagnosis server 108.
  • the patient chart entry form 500 may include a visual field entry section 502, a BCVA results entry section 504, an ERG results entry section 506, an AF images upload section 508, and a color images upload section 510.
  • each of the sections 502, 504, 506, 508, and 510 may have a corresponding selectable button (buttons 512, 514, 516, 518, and 520, respectively) which the clinician may select to enter patient data corresponding to each data type (visual field, BCVA, ERG, AF imagery, and color imagery).
  • Figs. 6A-6D illustrate screenshots of example entry forms accessible via selecting one of the buttons 512, 514, 516, 518, and 520.
  • a web browsing application may present a clinician with a visual field entry form 600 in response to the selection of the button 512, a BCVA entry form 610 in response to the selection of the button 514, an ERG entry form 620 in response to the selection of the button 516, and an AF image upload form 630 in response to selection of button 518.
  • the example entry forms 600, 610, and 620 and the example upload form 630 may include any suitable interfaces for the entry or upload of patient data, such as numerical/text/date entry boxes, radio buttons, buttons triggering file browsing interfaces, etc.
  • a clinician may select one of the respective done editing buttons 640, 650, 660, or 670 to send the entered or uploaded data (e.g., AF images) to the diagnosis server 108. Further, upon sending patient data to the diagnosis server 108, the clinician may be presented with or may have access to a summary of the entered patient data, such as the summary page 700 illustrated in Fig. 7.
  • At least some of the entered or uploaded patient data is pre-processed (block 304).
  • the pre-processing of (A) imagery data, (B) family history data, and (C) clinical data is discussed below.
  • any type of data related to inherited retinal disease may be pre-processed to transform the data into a conventional form.
  • the diagnosis server 108 may execute the image routine 162 to automatically identify, quantify, and categorize pathological markers of retinal disease based on uploaded images. Further, the diagnosis server 108 may execute the image routine 162 to transform the quantifications and categorizations of pathological markers into one or more feature vectors.
  • the diagnosis server 108 uses image processing methods to ensure that uploaded images are directly comparable with one another with respect to quality, contrast, and gain. Such a process may involve applying a non-linear transformation to some or all of the pixels of an image to compensate for differences in imaging sensor positioning and illumination conditions. Then, the diagnosis server 108 may apply an additional non-linear function in an attempt to normalize the input images with respect to one another. For example, the image routine 162, when executed by the diagnosis server 108, may use segmentation or histogram shifting in an attempt to normalize a set of uploaded images.
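  • the histogram shifting mentioned above can be realized in several ways; the sketch below uses histogram matching, one common normalization technique (the disclosure does not specify a particular algorithm), to map the intensity distribution of one image onto that of a reference image:

```python
import numpy as np

def match_histogram(source, reference):
    """Shift the intensity histogram of `source` toward that of `reference`
    so the two images become directly comparable. This is an illustrative
    normalization sketch, not the patent's exact procedure."""
    # Unique intensities, their positions, and their frequencies.
    src_vals, src_idx, src_counts = np.unique(
        source.ravel(), return_inverse=True, return_counts=True)
    ref_vals, ref_counts = np.unique(reference.ravel(), return_counts=True)
    # Cumulative distribution functions of both images.
    src_cdf = np.cumsum(src_counts).astype(np.float64) / source.size
    ref_cdf = np.cumsum(ref_counts).astype(np.float64) / reference.size
    # Map each source quantile onto the reference intensity at that quantile.
    mapped = np.interp(src_cdf, ref_cdf, ref_vals)
    return mapped[src_idx].reshape(source.shape)
```

    Because `np.interp` clamps to the endpoints of the reference values, the output intensities always fall inside the reference image's intensity range.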
  • Image pre-processing may also involve determining regions in each image to evaluate based on the locations of the fovea and optic disc.
  • the diagnosis server 108 may prompt a user (e.g., a clinician) to label the location of multiple structures in the eye on a representative uploaded image via a browser-based label interface.
  • Fig. 8, for example, illustrates a screenshot of a labeling interface 800 displayed within a web browser.
  • a clinician may select (e.g., via a click or tap) a location of the fovea and optic disc. Once selected, the clinician may click on the update points button 802 to send indications of the selected locations to the diagnosis server 108.
  • the example image routine 162 may use indicated locations of the fovea and optic disc to determine the location of the macula, in an implementation. Subsequently, the image routine 162 may utilize the location of the macula, and hence regions outside the macula, in the application of certain filters. Although input of the fovea and optic disc locations via clinician interaction is discussed above, it should be noted that an image routine may use computer vision techniques to automatically detect the location of the fovea and optic disc. Even if a clinician manually indicates the location of the fovea and/or the optic disc, the image routine 162 may still execute an automated technique to refine and validate such input.
  • image pre-processing may include an analysis on four distinct features: (i) the presence or absence of discrete hypofluorescent marks outside the macula; (ii) the presence or absence of hypofluorescence encompassing the macula; (iii) the presence or absence of hyperfluorescent marks outside the macula; and (iv) the presence or absence of a hyperfluorescent ring surrounding the macula (see example ring images included in Fig. 9).
  • Such an analysis may include an objective quantification of these features, such as the quantification discussed in U.S. Provisional Application No. 61/858,915, entitled "Automated measurement of changes in retinal, retinal pigment epithelial, or choroidal disease" and filed on July 26, 2013, the entire disclosure of which is hereby incorporated by reference herein.
  • Other methods of object detection may include the analysis of image texture, in some implementations. This analysis of image texture may involve a convolution of images with a sliding window detector, where the sliding window detector incorporates functions that respond uniquely to different spatial frequencies of intensity variation, for example.
  • the image routine 162 may utilize edge detection methods to delineate detected objects represented by consistent areas of a specific texture. Such edge detection methods may utilize established algorithms such as an Active contour model (snakes), Curvelet transforms, or Gabor wavelets, for example.
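  • as a concrete instance of the sliding-window detector described above, a Gabor kernel responds selectively to one spatial frequency and orientation of intensity variation; the sketch below is a minimal numpy version with illustrative parameter values (kernel size, wavelength, and sigma are assumptions, not values from the disclosure):

```python
import numpy as np

def gabor_kernel(size=9, wavelength=4.0, theta=0.0, sigma=2.0):
    """Real-valued Gabor kernel: a cosine carrier at the given wavelength and
    orientation, modulated by a Gaussian envelope."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)      # rotate coordinates
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * xr / wavelength)
    kernel = envelope * carrier
    return kernel - kernel.mean()                   # zero-mean: ignore flat regions

def texture_response(image, kernel):
    """Slide the kernel over the image and return the per-pixel response
    magnitude (valid region only)."""
    kh, kw = kernel.shape
    windows = np.lib.stride_tricks.sliding_window_view(image, (kh, kw))
    return np.abs(np.einsum("ijkl,kl->ij", windows, kernel))
```

    A striped region whose period matches the kernel wavelength produces a strong response, while a uniform region produces essentially none, which is what lets consistent areas of a specific texture be delineated.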
  • image pre-processing may include the extraction of other clinically relevant features.
  • such features may include hyperfluorescent flecks and hypofluorescent flecks (see Fig. 10 for an example image with hyperfluorescent flecks and Fig. 11 for an example image with hypofluorescent flecks).
  • the image routine 162 may extract these features using a combination of object detection methods to identify blobs, flecks, drusen deposits, and any other additional clinically related features.
  • the image routine 162 may generate one or more feature vectors representing the imagery data about the patient.
  • the feature vectors may include indications of the total area of
  • the image routine 162 may transform image features to discrete Boolean values (e.g., via hyperfluorescent or hypofluorescent indications compared, or paired, to one or more threshold values).
  • the image routine 162 may store image feature vectors, representative of the imagery data corresponding to the patient, in a database, such as the output database 114. In this manner, the image routine 162 or other routines of the diagnosis server 108 may access the image feature vectors later in the flow of method 300.
  • (B) With regard to uploaded family history data corresponding to a patient, the diagnosis server 108 may execute the genetics routine 164 to generate probabilities corresponding to genetic modes of transmission.
  • the genetics routine 164, when executed by the diagnosis server 108, compares provided family history data to stored rules (e.g., stored in the memory 152a) regarding a variety of modes of transmission. As part of the comparison, the genetics routine 164 may compute a probability of each genetic mode of transmission.
  • modes of transmission which may be assessed by the genetics routine 164 include autosomal dominant transmission, autosomal recessive transmission, X-linked recessive transmission, and a simplex case, and the corresponding rules may be based on clinically accepted probabilities or averages.
  • the genetics routine 164 may generate a numeric probability score corresponding to the transmission of a particular genetic disorder.
  • the genetics routine 164 may store numeric probabilities in a database, such as the output database 114.
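  • the rule-based scoring above can be sketched as follows; the pedigree features, rules, and weights in this sketch are illustrative placeholders, not the clinically derived rules the disclosure refers to:

```python
def transmission_probabilities(family):
    """Score candidate modes of transmission from simple pedigree features
    and normalize the scores into probabilities. Rules and weights are
    hypothetical examples for illustration only."""
    scores = {"autosomal_dominant": 1.0, "autosomal_recessive": 1.0,
              "x_linked_recessive": 1.0, "simplex": 1.0}
    if family.get("affected_parent"):
        scores["autosomal_dominant"] *= 4.0     # vertical transmission
        scores["simplex"] *= 0.2
    if family.get("consanguinity"):
        scores["autosomal_recessive"] *= 3.0    # parental consanguinity
    if family.get("only_males_affected"):
        scores["x_linked_recessive"] *= 3.0
    if family.get("no_family_history"):
        scores["simplex"] *= 3.0
    total = sum(scores.values())
    return {mode: s / total for mode, s in scores.items()}
```

    Normalizing to probabilities gives the downstream clinical routine a consistent set of numeric weights regardless of how many rules fired.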
  • the clinical routine 166, when executed by the diagnosis server 108, may generate a clinical feature vector representative of the patient whose patient data has been received by the diagnosis server 108.
  • clinical data received at block 302 includes a plurality of clinical parameters, such as ERG parameters (or measurements), visual acuity parameters, and visual field parameters. Based on these parameters, the clinical routine 166 may calculate bounded continuous values and/or discrete Boolean values.
  • ERG parameters are natively continuous variables, and, therefore, the clinical routine 166 may only need to normalize the numeric ERG parameters to correct for age and equipment variability.
  • Visual acuity (or BCVA) has corresponding descriptive (e.g., 20/50), rather than numeric, parameters.
  • the clinical routine 166 may translate visual acuity parameters to a known continuous scale, such as the Logarithm of Minimum Angle of Resolution (logMAR) scale.
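The translation to the logMAR scale is a standard computation: logMAR is the base-10 logarithm of the reciprocal of the Snellen fraction. A minimal sketch (the function name is an illustrative assumption):

```python
import math

def snellen_to_logmar(snellen):
    """Convert a Snellen acuity string such as '20/50' to a logMAR value.

    logMAR = log10(denominator / numerator), so 20/20 maps to 0.0 and
    20/200 maps to 1.0; larger values indicate worse acuity.
    """
    numerator, denominator = (float(part) for part in snellen.split("/"))
    return math.log10(denominator / numerator)
```

For example, the descriptive parameter 20/50 translates to log10(50/20) ≈ 0.40 on the continuous logMAR scale.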
  • Visual field is a feature-based description (central scotoma, enlarged optic nerve scotoma, decreased range, etc.). To translate this feature-based description, the clinical routine 166 may generate a Boolean representation.
  • the Boolean vector of ⁇ True, False, True, ... ⁇ may represent features that are present in a visual field such as peripheral visual field constriction (e.g., True in this case), central scotoma at the macula (e.g., False in this case), enlarged scotoma at the optic nerve (e.g., True in this case), etc.
  • a Boolean representation will allow for later comparison to data in the mutation proven database 112, which may have patient information that has been tagged in an identical manner, in an implementation. For example, a routine may compare two given clinical feature vectors with paired True-False values at the same position value in each vector. Such a comparison is further discussed with reference to block 308.
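A position-by-position comparison of paired Boolean feature vectors might be sketched as follows (the function name and the agreement-fraction measure are illustrative assumptions):

```python
def boolean_similarity(vec_a, vec_b):
    """Fraction of positions at which two Boolean feature vectors agree.

    Both vectors must tag the same visual field features in the same order,
    mirroring the identical tagging of records in the mutation proven database.
    """
    if len(vec_a) != len(vec_b):
        raise ValueError("feature vectors must tag the same features")
    matches = sum(a == b for a, b in zip(vec_a, vec_b))
    return matches / len(vec_a)
```

Two patients whose vectors agree on peripheral constriction and enlarged optic nerve scotoma but differ on central scotoma would score 2/3 under this measure.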
  • clinical feature vectors and modes of genetic transmission may be combined to create a single feature vector that is representative of the patient whose clinical data has been input into the system, in an implementation.
  • the clinical routine 166 may utilize numeric probabilities of genetic modes of transmission (e.g., generated from family history data) to weight and combine various feature vectors generated from clinical data (e.g., a BCVA feature vector, a visual field feature vector, and an ERG feature vector).
  • the clinical routine 166 may generate one or more numeric or Boolean-based feature vectors representing the patient's clinical data.
  • the clinical routine 166 may store continuous/Boolean vectors in a database, such as the output database 114.
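The weighting and combining of per-type feature vectors into a single patient vector might be sketched as below; the weighted-concatenation scheme is an assumption for illustration, not the disclosed implementation.

```python
def combine_feature_vectors(vectors, weights):
    """Combine several numeric feature vectors into one patient vector.

    Each source vector (e.g., a BCVA, visual field, or ERG feature vector) is
    scaled by a weight, such as a probability derived from the genetic
    mode-of-transmission analysis, then the scaled vectors are concatenated.
    """
    combined = []
    for vector, weight in zip(vectors, weights):
        combined.extend(weight * value for value in vector)
    return combined
```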
  • representations of clinical, demographic, and imagery data are compared with data in the mutation proven database 112.
  • the diagnosis routine 168 (executed by the diagnosis server 108) may query the mutation proven database 112 against a numerical clinical feature vector, or, in the case of a Boolean feature vector, the diagnosis routine 168 may compare two given feature vectors whose True-False values are paired.
  • the diagnosis routine 168 may compare image feature vectors showing specific disease markers to records in the mutation proven database 112 corresponding to images tagged with disease markers.
  • ranked lists of mutation proven patients are generated (block 308).
  • a comparison of a clinical feature vector with data in the mutation proven database 112 may return a ranked list of "nearest neighbors" to the clinical feature vector.
  • the diagnosis routine 168 may execute an approximate nearest neighbor search in an arbitrarily high dimensional space, or other suitable machine learning routine, to identify the entries (i.e., known patients with proven genetic diagnoses) in the mutation proven database 112 that are similar to the clinical/imagery data about the patient, along with corresponding distances between the patient and those entries.
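As a sketch of the ranked "nearest neighbors" idea, an exhaustive Euclidean scan stands in here for the approximate nearest-neighbor search described above; the data layout (diagnosis label mapped to a feature vector) is an illustrative assumption.

```python
import heapq
import math

def nearest_neighbors(patient_vector, database, k=5):
    """Return the k database entries nearest to the patient's feature vector.

    `database` maps a known patient's genetic diagnosis to that patient's
    feature vector. A production system would likely use an approximate
    nearest-neighbor index instead of this exhaustive scan.
    """
    def distance(vec):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(patient_vector, vec)))

    # Keep the k closest entries, then report each with its distance.
    ranked = heapq.nsmallest(k, database.items(), key=lambda item: distance(item[1]))
    return [(diagnosis, distance(vec)) for diagnosis, vec in ranked]
```

The returned list is ordered by ascending distance, i.e., the most similar mutation proven patient appears first.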
  • ranked lists of matches in the mutation proven database may be ordered lists, whereas, in other implementations, ranked lists of matches may be sorted according to corresponding probabilities.
  • the diagnosis routine 168 may generate one or more ranked lists of matches in the mutation proven database 112 for each of the types of data received at block 302. For example, a ranked list of matches may be generated for each of an imagery feature vector, a clinical feature vector, and a plurality of demographic information.
  • the ranked lists of matches may be stored in corresponding data structures of the output database 114, such as in the clinical output 132 and the imagery output 134 (block 310).
  • multiple ranked lists of matches to proven genetic diagnoses are weighted and combined to produce a final ranked list of genetic diagnoses (block 312).
  • the weights used to combine multiple ranked lists into a final ranked list of genetic diagnoses may be at least partially based on human input (e.g., via weighting of parameters) from clinicians with expertise in inherited retinal diseases.
  • the weights used to generate the final ranked list of genetic diagnoses may be refined over time based on further accumulated data in the mutation proven database 112 and/or based on further learning of algorithms. For example, expert clinicians may initialize the diagnosis routine 168 with certain weights based on clinical experience. Then, as the diagnosis routine 168 is verified, the weights of the diagnosis routine 168 may be refined.
  • the diagnosis routine 168 may use any suitable supervised or unsupervised machine learning technique to update weights in light of verification results and/or newly accumulated data.
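A weighted Borda-style count is one possible sketch of combining multiple ranked lists into a final ranked list; the disclosure does not specify the aggregation formula, so the scoring scheme here is an assumption.

```python
def aggregate_ranked_lists(ranked_lists, weights):
    """Combine several ranked lists of candidate genes into one final list.

    Each list contributes points to a gene according to the gene's rank,
    scaled by that list's weight. The weights stand in for the
    clinician-initialized, subsequently refined weights described above.
    """
    scores = {}
    for genes, weight in zip(ranked_lists, weights):
        for rank, gene in enumerate(genes):
            points = weight * (len(genes) - rank)  # top rank earns the most points
            scores[gene] = scores.get(gene, 0.0) + points
    return sorted(scores, key=scores.get, reverse=True)
```

A gene absent from one list (e.g., because visual field data was unavailable) simply earns no points from that list, which matches the method's tolerance for variable degrees of input data.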
  • the method 300 may accommodate variable degrees of input data. If, for example, visual field data is unavailable, the diagnosis routine 168 may still calculate the most likely genetic diagnoses based on other available information, such as BCVA or ERG data. In general, the method 300 may utilize any amount and combination of dissimilar types of data related to inherited retinal disease.
  • the diagnosis routine 168 may generate the final ranked list of genetic diagnoses as an ordered list of genetic diagnoses, or the diagnosis routine 168 may calculate a numeric probability (e.g., 90%) corresponding to each genetic diagnosis in the list of genetic diagnoses.
  • the algorithms employed by the diagnosis routine 168 to generate such probabilities may include machine learning algorithms that may be refined after verification and the accumulation of additional data.
  • the final ranked list of likely genetic diagnoses is sent to the end user device, operated by the clinician, to provide an indication of likely causative genes and/or gene mutations corresponding to the patient (block 312).
  • the ranked list of genetic diagnoses may be sent to the end user device as a table of genes and corresponding probabilities ordered according to descending probability.
  • a clinician may utilize such a ranked list of genetic diagnoses as a guide for genetic screening, such as prioritizing genes for consideration in whole exome sequencing or whole genome sequencing.
  • the method 300 may aid clinicians in rapidly and efficiently diagnosing a patient's retinal condition and guiding genetic testing strategies.
  • a clinician operating the end user device 102 may both rapidly exploit a variety of dissimilar types of data and reduce the logistical complexity of having to screen a large number of genes.
  • the method 300 may provide a clinician sufficient automated capabilities to consider a plurality of likely retinal diseases, even in the absence of specialized medical training in inherited retinal diseases.
  • Fig. 12 illustrates a screenshot of an example results page 1200 that may be presented to a clinician (e.g., within a web browser) after a ranked list of genetic diagnoses is sent to the end user device 102.
  • the results page includes a brief summary 1202 of information about the patient and a table 1204 of gene-probability pairs.
  • the genes are represented by a corresponding gene name (RPGR, RP2, etc.), and the table 1204 may only include genes with a probability, or likelihood, of 1% or greater.
  • a ranked list of genes may be presented to a clinician in any suitable representation, such as a pie chart, graph, bar chart, list, etc.
  • At least some of the diagnoses in a ranked list of genetic diagnoses presented to a clinician may be selectable by the clinician. In this manner, the clinician may view further information about the diagnosis.
  • the diagnosis server 108 may generate diagnosis summaries.
  • the diagnosis server 108 may generate a diagnosis summary with trends indicating implications of a particular diagnosis on patients matching the demographics of the patient whose information is under consideration.
  • a diagnosis summary may include any representative information about a diagnosis, such as medical book excerpts, parameter trends (e.g., ERG and BCVA trends), exemplary images, etc.
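The presentation described above, gene-probability pairs sorted by descending probability with genes under 1% omitted, can be sketched as follows (the function name and tuple layout are illustrative assumptions):

```python
def format_results_table(gene_probabilities, threshold=0.01):
    """Build the rows of a results table of gene-probability pairs.

    Rows are sorted by descending probability, and genes below the threshold
    (1% here, mirroring the example results page 1200) are omitted.
    """
    rows = [(gene, prob) for gene, prob in gene_probabilities.items()
            if prob >= threshold]
    rows.sort(key=lambda row: row[1], reverse=True)
    return [(gene, f"{prob:.0%}") for gene, prob in rows]
```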

Abstract

A method for automatically diagnosing inherited retinal disease includes receiving a plurality of dissimilar types of data and pre-processing at least one of the plurality of dissimilar types of data to generate a feature vector descriptive of a patient. Further, the method includes, for each of the plurality of dissimilar types of data: (i) comparing portions of the respective type of data or a corresponding feature vector to data in a mutation proven database; (ii) generating a ranked list of matches between the patient and the plurality of patients with known diagnoses; and (iii) storing the ranked list of matches in an output database. A diagnosis routine then aggregates a plurality of ranked lists of matches in the output database to generate a ranked list of genetic diagnoses corresponding to the patient and sends an indication of the ranked list of genetic diagnoses to the end user device.

Description

SYSTEMS AND METHODS FOR DIAGNOSING INHERITED RETINAL DISEASES
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of U.S. Provisional Application No.
61/881,711, entitled "Systems and Methods for Diagnosing Inherited Retinal Diseases," which was filed on September 24, 2013, the disclosure of which is hereby incorporated herein by reference in its entirety for all purposes.
TECHNICAL FIELD
[0002] The present disclosure generally relates to a method for diagnosing retinal diseases and, more particularly, to a method for automated diagnosis of inherited retinal disease.
BACKGROUND
[0003] The human retina is a thin layer of neural tissue at the back of the eye that transforms light into electrical signals for the brain. The retina can be divided into distinct regions related to their visual function. These distinct regions are: (i) the posterior pole, where the majority of photoreceptor cells (responsible for central, high acuity color vision) lie; and (ii) the periphery, which includes everything outside the posterior pole. In particular, the posterior pole includes a non-photosensitive structure known as the optic nerve and the macula. Within the center of the macula is a region known as the fovea, which is responsible for nearly all of our high acuity color vision.
[0004] The process of diagnosing an inherited retinal disease typically involves recording demographic information about the patient, documenting their ocular and medical history including the age of onset and reported symptoms, evaluating the family history to determine if there is specific genetic mode of transmission that can be distinguished, and performing a clinical evaluation. The clinical evaluation often includes diagnostic testing with a best corrected visual acuity (BCVA) measurement,
electroretinography (ERG) measurement, a Goldmann visual field (GVF) measurement, and interpretation of retinal imaging (e.g., optical coherence tomography, fundus autofluorescence, and fluorescein angiography). [0005] Although the diagnostic process is simple in principle, the task is, in practice, an art. A clinician bases a diagnosis on previous experience and knowledge of the existing literature, and, as a result, only a handful of specialists in the United States can effectively diagnose and manage patients with inherited retinal conditions. Moreover, several factors complicate the diagnostic process, because inherited retinal conditions are clinically and genetically heterogeneous. First, variability can arise in the clinical presentation of patients with the same condition. Second, findings between conditions can have considerable overlap, and a clinician must understand which findings are more useful in making the correct diagnosis. Still further, even if the clinician does have a clear understanding of clinical findings and the likely diagnosis, the clinician must be able to use available information to order appropriate genetic testing for molecular confirmation of the diagnosis. This requires additional proficiency in the complicated field of genetics, because multiple genes can cause the same condition.
[0006] Confirming a clinical diagnosis with genetic testing is important because such a confirmation can play a vital role in the management and preservation of vision in patients by aiding in the determination of appropriate therapeutic interventions.
Currently, however, cost conscious health management organizations demand that clinicians employ the most cost-effective means for diagnosing a patient's condition. The cost of screening a single gene can be anywhere from several hundred to several thousands of dollars. It can, therefore, be cost prohibitive to pursue genetic testing for all genes associated with a particular condition. Also, screening large numbers of genes introduces logistical complexity in the process of diagnosis. Even when multiple genes can be tested at the same time, as in a panel of genes or with whole genome
sequencing, determining causative mutations from polymorphisms can be guided by an assessment of the patient's phenotype.
SUMMARY
[0007] In one embodiment, a computer-implemented method for automatically diagnosing inherited retinal disease comprises receiving, via a network interface, a plurality of dissimilar types of data from an end user device, the plurality of dissimilar types of data being related to retinal disease and the plurality of dissimilar types of data corresponding to a patient, and pre-processing, with one or more processors, at least one of the plurality of dissimilar types of data to generate a feature vector descriptive of the patient. Further, the method includes, for each of the plurality of dissimilar types of data: (i) comparing, with the one or more processors, portions of the respective type of data or a corresponding feature vector to data in a mutation proven database, wherein the data in the mutation proven database is similar to the respective type of data and wherein the data in the mutation proven database corresponds to a plurality of patients with known genetic diagnoses; (ii) generating, with the one or more processors, a ranked list of matches between the patient and the plurality of patients with known genetic diagnoses; and (iii) storing, with the one or more processors, the ranked list of matches in an output database. Still further, the method includes aggregating, with the one or more processors, a plurality of ranked lists of matches in the output database to generate a ranked list of genetic diagnoses corresponding to the patient, and sending, via the network interface, an indication of the ranked list of genetic diagnoses to the end user device.
[0008] In another embodiment, a computer device for automatically diagnosing inherited retinal disease comprises one or more processors and one or more memories coupled to the one or more processors, wherein the one or more memories include computer executable instructions stored therein that, when executed by the one or more processors, cause the one or more processors to: receive, via a network interface, a plurality of dissimilar types of data from an end user device, the plurality of dissimilar types of data being related to retinal disease and the plurality of dissimilar types of data corresponding to a patient, and pre-process at least one of the plurality of dissimilar types of data to generate a feature vector descriptive of the patient. Further, the computer executable instructions cause the one or more processors to, for each of the plurality of dissimilar types of data: (i) compare portions of the respective type of data or a corresponding feature vector to data in a mutation proven database, wherein the data in the mutation proven database is similar to the respective type of data and wherein the data in the mutation proven database corresponds to a plurality of patients with known genetic diagnoses; (ii) generate a ranked list of matches between the patient and the plurality of patients with known genetic diagnoses; and (iii) store the ranked list of matches in an output database. Still further, the computer executable instructions cause the one or more processors to aggregate a plurality of ranked lists of matches in the output database to generate a ranked list of genetic diagnoses corresponding to the patient and send an indication of the ranked list of genetic diagnoses to the end user device.
BRIEF DESCRIPTION OF THE DRAWINGS
[0009] Fig. 1 illustrates an example computing environment for automated diagnosis of inherited retinal disease.
[0010] Fig. 2 is a block diagram of an example implementation of the diagnosis server illustrated in Fig. 1.
[0011] Fig. 3 is a flow diagram of an example method for automated diagnosis of inherited retinal disease which can be implemented by the diagnosis server illustrated in Fig. 2.
[0012] Fig. 4 illustrates a screenshot of an example landing page 400 which can be utilized as part of the method of Fig. 3.
[0013] Fig. 5 illustrates a screenshot of an example detailed patient chart entry form which can be utilized as part of the method of Fig. 3.
[0014] Figs. 6A-6D illustrate screenshots of example entry or upload forms which can be accessed via the detailed patient chart entry form illustrated in Fig. 5.
[0015] Fig. 7 illustrates a screenshot of an example completed patient chart entry form which can be a completed version of the detailed patient chart entry form illustrated in Fig. 5.
[0016] Fig. 8 illustrates a screenshot of an example labeling interface which can be utilized as part of the method of Fig. 3.
[0017] Fig. 9 is an example array of hyperfluorescent rings in autofluorescence imaging.
[0018] Fig. 10 is an example image with hyperfluorescent flecks in autofluorescence imaging. [0019] Fig. 11 is an example image with hypofluorescent flecks in autofluorescence imaging.
[0020] Fig. 12 illustrates a screenshot of an example results page which can be utilized as part of the method of Fig. 3.
DETAILED DESCRIPTION
[0021] Although the following text sets forth a detailed description of numerous different embodiments, it should be understood that the legal scope of the description is defined by the words of the claims set forth at the end of this disclosure. The detailed description is to be construed as exemplary only and does not describe every possible embodiment since describing every possible embodiment would be impractical, if not impossible. Numerous alternative embodiments could be implemented, using either current technology or technology developed after the filing date of this patent, which would still fall within the scope of the claims.
[0022] It should also be understood that, unless a term is expressly defined in this patent using the sentence "As used herein, the term ' ' is hereby defined to mean..." or a similar sentence, there is no intent to limit the meaning of that term, either expressly or by implication, beyond its plain or ordinary meaning, and such terms should not be interpreted to be limited in scope based on any statement made in any section of this patent (other than the language of the claims). To the extent that any term recited in the claims at the end of this patent is referred to in this patent in a manner consistent with a single meaning, that is done for the sake of clarity only so as to not confuse the reader, and it is not intended that such claim term be limited, by implication or otherwise, to that single meaning. Finally, unless a claim element is defined by reciting the word "means" and a function without the recital of any structure, it is not intended that the scope of any claim element be interpreted based on the application of 35 U.S.C. § 112, sixth paragraph.
System Overview
[0023] As used herein, the term "fundus" is hereby defined to mean the surface of an organ opposite its opening. A fundus may be a human retina, for example. [0024] Fig. 1 illustrates an example computing environment 100 to automatically generate diagnoses of inherited retinal disease. A user of an end user device 102 is communicatively coupled, via one or more wired or wireless interfaces, to a network 104 and a web server 106. The end user device may include any suitable computing device such as a personal computer, smartphone, tablet computer, etc. operated by a clinician, for example. The network 104 may be a proprietary network, a secure public internet, a virtual private network or some other type of network, such as dedicated access lines, plain ordinary telephone lines, satellite links, combinations of these, etc. Where the network 104 comprises the Internet, data communications may take place over the network 104 via an Internet communication protocol.
[0025] The web server 106 may be implemented in one of several known
configurations via one or more servers configured to process web-based traffic received via the network 104 and may include load balancing, edge caching, proxy services, authentication services, etc.
[0026] In an implementation, the end user device 102 is capable of executing a graphical user interface (GUI) for a retinal disease diagnosis tool within a web browser application, such as Apple's Safari®, Google Android™ mobile web browser, Microsoft Internet Explorer®, etc. The web browser application may be implemented as a series of machine-readable instructions for receiving, interpreting, and displaying web page information from the web server 106 while also receiving inputs (e.g., genetics, clinical, or imagery data) from the user. Further, those skilled in the art will recognize that the present system may be utilized in a dedicated application in addition to a web browser.
[0027] A diagnosis server 108 may include a number of software applications responsible for generating retinal disease diagnosis tool content to be included in the web pages sent from the web server 106 to the end user device 102. For example, the diagnosis server 108 may generate data input forms (e.g., for input of clinical data and imagery data about a patient), diagnosis results tables, disease trend indications, etc. as discussed below, to be included in the web pages sent to the end user device 102. The details of an implementation of the diagnosis server 108 are discussed in more detail with reference to Fig. 2. Further, although Fig. 1 illustrates one diagnosis server 108, the techniques of the present disclosure may be implemented by any number of servers with any number of processors, such as in a "cloud computing" implementation.
[0028] The example diagnosis server 108 is operatively connected to a mutation proven database 112 and an output database 114. However, it should be noted that, while not shown, additional databases may be operatively connected to the diagnosis server 108 and/or the databases 112 and 114 may be combined or split into any number of databases or data structures.
[0029] The mutation proven database 112 may be, for example, configured to store data about a plurality of patients with known genetic diagnoses. The data about the plurality of patients may include: (i) a plurality of demographic data 120 such as age, sex, etc.; (ii) a plurality of clinical data 122 such as ERG values, visual field information, visual acuity information, etc.; and (iii) a plurality of imagery data 124 such as autofluorescence (AF), Optical Coherence Tomography (OCT), or color images in any suitable format (TIF, JPG, PNG, etc.). In one implementation, the mutation proven database 112 includes a list of patient identifications (IDs) and corresponding entries (e.g., demographic, clinical, and imagery entries) for each patient ID. It is understood, however, that the mutation proven database 112 may include any suitable type of data related to known genetic diagnoses, and the mutation proven database 112 may be organized according to any appropriate data structure.
[0030] The output database 114 may be, for example, configured to store results or output generated as part of an automated diagnosis of inherited retinal disease. The results stored in the output database 114 may include: (i) genetics output 130 (or "mode of transmission" output) such as generated probabilities corresponding to various genetic modes of transmission and based on family history data about a patient; (ii) clinical output 132 such as clinical feature vectors representative of clinical data (e.g., ERG values) about the patient and/or ranked lists of genetic diagnoses based on clinical data about the patient; and (iii) imagery output 134 such as imagery feature vectors representative of imagery data (e.g., AF images) and/or ranked lists of genetic diagnoses based on imagery data about the patient. Further, the output database 114 may store a final ranked list of genetic diagnoses based on all the data about a patient (e.g., including both clinical and imagery data about a patient). The generation, storing, and utilization of genetics, clinical, and imagery output is further discussed with reference to Fig. 3.
[0031] Fig. 2 illustrates an example diagnosis server 150 that may automatically generate diagnoses of inherited retinal disease and generate content for display on an end user device. The diagnosis server 150 may be implemented as the diagnosis server 108 in the example computing system 100, for example. The diagnosis server 150 may include one or more processing units 151, a system memory 152a and 152b, and a system bus 154 that couples various system components including the system memory 152 to the processing units 151. The system bus 154 may include an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, a Peripheral Component Interconnect (PCI) bus or a Mezzanine bus, or a Peripheral Component Interconnect Express (PCI-E) bus.
[0032] The diagnosis server 150 may include an assortment of computer-readable media. Computer-readable media may be any media that may be accessed by the diagnosis server 150. By way of example, and not limitation, the media may include both volatile and nonvolatile media, removable and non-removable media. Media may also include computer storage media and communication media. Computer storage media may include volatile and nonvolatile, removable and non-removable media that stores information such as computer-readable instructions, program modules, data structures, or other data. Computer-storage media may include RAM, ROM, EEPROM, or other memory technology, optical storage disks, magnetic storage devices, and any other medium which may be used to store computer-accessible information.
Communication media may be computer-readable instructions, data structures, program modules, or other data in a modulated data signal or other transport mechanism.
Communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as RF, infrared, and other wireless media. [0033] The system memory may include storage media in the form of volatile and/or non-volatile memory such as ROM 152b and RAM 152a. A basic input/output system (BIOS), containing algorithms to transfer information between components within the computer 150, may be stored in ROM 152b. Data or program modules that are immediately accessible or are presently in use by the processing units 151 may be stored in RAM 152a. Data normally stored in RAM 152a while the diagnosis server 150 is in operation may include an operating system, application programs, program modules, and program data. In particular, the RAM 152a may store a retinal disease diagnosis application 160 including an image routine 162, a genetics routine 164, a clinical routine 166, and a diagnosis routine 168, for example.
[0034] The diagnosis server 150 may also include other storage media such as a hard disk drive that may read from or write to non-removable, non-volatile magnetic media, a magnetic disk drive that reads from or writes to a removable, non-volatile magnetic disk, and an optical disk drive that reads from or writes to a removable, nonvolatile optical disk. Other storage media that may be used includes magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, and solid state ROM. The hard disk drive may be connected to the system bus 154 through a non-removable memory interface such as interface 174. A magnetic disk drive and optical disk drive may be connected to the system bus 154 by a removable memory interface, such as interface 190.
[0035] A user may interact with the diagnosis server 150 through input devices such as a keyboard or a pointing device (e.g., a mouse). A user input interface 202 may be coupled to the system bus 154 to allow the input devices to communicate with the processing units 151. A display device 222, such as a monitor, may also be connected to the system bus 154 via a video interface (not shown).
[0036] The diagnosis server 150 may operate in a networked environment using logical connections to one or more remote computing devices, such as end user device 102 or web server 106, for example. The remote computing device may be a personal computer (PC), a server, a router, or other common network node. The remote computing device typically includes many or all of the previously-described elements regarding the diagnosis server 150, even though such elements are not illustrated in the remote computing devices of Fig. 1. Logical connections between the diagnosis server 150 and one or more remote computing devices may include a wide area network (WAN). A typical WAN is the Internet. When used in a WAN, the diagnosis server 150 may include a modem or other means for establishing communications over the WAN. The modem may be connected to the system bus 154 via the network interface 225, or other mechanism. In a networked environment, program modules depicted relative to the diagnosis server 150 may be stored in the remote memory storage device. As may be appreciated, other means of establishing a communications link between the computer 150 and a remote computing device may be used.
Automated Diagnosis of Inherited Retinal Disease
[0037] Fig. 3 is a flow diagram of an example method 300 for automatically diagnosing inherited retinal disease. The method 300 may be implemented by the diagnosis server 108, for example. To aid in the description of the method 300, example screenshots, illustrated in Figs. 4, 5, 6A-6D, 7, 8, and 12, are referred to below. However, it is understood that a diagnosis server may utilize any suitable web browser or application based content and interaction to facilitate the automatic diagnosis of inherited retinal disease. Further, elements of Fig. 1 are referred to in the description of the method 300, but, of course, the method 300 may be implemented in any suitable computing environment.
[0038] To begin, patient data is received from an end user device (block 302). For example, a clinician may operate the end user device 102 to input patient data via a web browser application and subsequently send the patient data to the diagnosis server 108. The patient data may include multiple dissimilar types of data related to inherited retinal disease. By way of example and without limitation, the patient data may include: (i) family history data, such as number and gender of children, number of brothers and sisters, number of paternal and maternal aunts and uncles, parents, and grandparents, who, if any, is affected by disease, the age of onset of disease in each individual, and whether or not there is consanguinity between the patient's parents; (ii) demographic data, such as age, sex, ethnicity, etc. ; (iii) clinical data such as Best Corrected Visual Acuity (BCVA), visual field, ERG measurements, and electrooculography (EOG) measurements; and (iv) imagery data, such as AF, color, and OCT images of the retina. In general, a clinician may input, and a diagnosis server may receive, any number and/or combination of types of data related to inherited retinal disease.
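For illustration only, the dissimilar types of patient data sent from the end user device to the diagnosis server might be organized as a structure like the following; every field name and value here is an assumption, not a format specified by this disclosure.

```python
# Illustrative shape of the dissimilar types of patient data received at
# block 302. All field names and values are assumptions for illustration.
patient_data = {
    "demographics": {"age": 27, "sex": "M", "ethnicity": "unspecified"},
    "family_history": {
        "affected_relatives": ["maternal uncle"],
        "consanguinity": False,
    },
    "clinical": {
        "bcva": {"right": "20/50", "left": "20/70"},
        "visual_field": {"peripheral_constriction": True, "central_scotoma": False},
        "erg": {"scotopic_b_wave_uv": 85.0},
    },
    "imagery": {"af_images": ["af_od.png", "af_os.png"], "oct_images": []},
}
```

Any subset of these keys could be present, consistent with the method's support for variable degrees of input data.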
[0039] Fig. 4 illustrates a screenshot of an example landing page 400, accessed via a web browser application, in which a clinician may begin to input patient data and subsequently send the patient data to a diagnosis server for evaluation. The landing page 400 may include multiple form fields in which a clinician may input information about a patient. For example, the clinician may input a "Nickname" (e.g., "New Patient") for the patient into a nickname form field 402 and sex and age information into one or more demographics form fields 404. The clinician may send entered patient data to the diagnosis server 108 by selecting (e.g., via a click or tap) the add patient button 406. In some implementations, selection of the add patient button 406 will also register the patient (e.g., under the nickname "New Patient") in a list of patients (not shown), such that corresponding patient data may be later accessed by the clinician.
[0040] Alternatively, or in response to selection of the add patient button 406, the clinician may be presented with a detailed patient chart entry form 500, as illustrated in Fig. 5. The clinician may use the detailed patient chart entry form 500 to enter or upload a variety of dissimilar types of patient data (BCVA, ERG, etc.) and subsequently send the entered or uploaded patient data to the diagnosis server 108. For example, the patient chart entry form 500 may include a visual field entry section 502, a BCVA results entry section 504, an ERG results entry section 506, an AF images upload section 508, and a color images upload section 510. In addition, each of the sections 502, 504, 506, 508, and 510 may have a corresponding selectable button (buttons 512, 514, 516, 518, and 520, respectively) which the clinician may select to enter patient data corresponding to each data type (visual field, BCVA, ERG, AF imagery, and color imagery).
[0041] Figs. 6A-6D illustrate screenshots of example entry forms accessible via selecting one of the buttons 512, 514, 516, 518, and 520. For example, a web browsing application may present a clinician with a visual field entry form 600 in response to the selection of the button 512, a BCVA entry form 610 in response to the selection of the button 514, an ERG entry form 620 in response to the selection of the button 516, and an AF image upload form 630 in response to selection of button 518. The example entry forms 600, 610, and 620 and the example upload form 630 may include any suitable interfaces for the entry or upload of patient data, such as numerical/text/date entry boxes, radio buttons, buttons triggering file browsing interfaces, etc. After entering or uploading patient data via one of the entry forms 600, 610, or 620 or the upload form 630, a clinician may select one of the respective done editing buttons 640, 650, 660, or 670 to send the entered or uploaded data (e.g., AF images) to the diagnosis server 108. Further, upon sending patient data to the diagnosis server 108, the clinician may be presented with or may have access to a summary of the entered patient data, such as the summary page 700 illustrated in Fig. 7.
[0042] Returning to Fig. 3, after receiving patient data (e.g., via a clinician interacting with a web browsing application), at least some of the entered or uploaded patient data is pre-processed (block 304). By way of example, the pre-processing of (A) imagery data, (B) family history data, and (C) clinical data is discussed below. However, it is understood that any type of data related to inherited retinal disease may be pre-processed to transform the data into a conventional form.
[0043] (A) With regard to uploaded imagery data corresponding to a patient, such as AF, OCT, and color images, the diagnosis server 108 may execute the image routine 162 to automatically identify, quantify, and categorize pathological markers of retinal disease based on uploaded images. Further, the diagnosis server 108 may execute the image routine 162 to transform the quantifications and categorizations of pathological markers into one or more feature vectors.
[0044] In one implementation, the diagnosis server 108 uses image processing methods to ensure that uploaded images are directly comparable with one another with respect to quality, contrast, and gain. Such a process may involve applying a non-linear transformation to some or all of the pixels of an image to compensate for differences in imaging sensor positioning and illumination conditions. Then, the diagnosis server 108 may apply an additional non-linear function in an attempt to normalize the input images with respect to one another. For example, the image routine 162, when executed by the diagnosis server 108, may use segmentation or histogram shifting in an attempt to normalize a set of uploaded images.
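Purely for illustration, the histogram-shifting normalization mentioned above might be sketched as follows. This is a minimal sketch, not the disclosed routine: the function name, the mean/standard-deviation matching approach, the 8-bit intensity range, and the synthetic images are all assumptions.

```python
import numpy as np

def normalize_to_reference(image, reference):
    """Shift and scale an image's intensity histogram so that its mean and
    standard deviation match those of a reference image (a simple form of
    histogram shifting)."""
    img = image.astype(float)
    ref = reference.astype(float)
    standardized = (img - img.mean()) / (img.std() + 1e-9)
    matched = standardized * ref.std() + ref.mean()
    # Clip back into a valid 8-bit intensity range.
    return np.clip(matched, 0, 255)

# Two synthetic "AF images" captured with different gain/illumination.
rng = np.random.default_rng(0)
reference = rng.normal(120, 30, size=(64, 64))
dim_image = rng.normal(60, 10, size=(64, 64))   # under-illuminated, low gain
normalized = normalize_to_reference(dim_image, reference)
```

After normalization, the dim image's intensity statistics approximately match the reference's, so the two images can be compared directly.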
[0045] Image pre-processing may also involve determining regions in each image to evaluate based on the locations of the fovea and optic disc. In order to facilitate such a determination, the diagnosis server 108 may prompt a user (e.g., a clinician) to label the location of multiple structures in the eye on a representative uploaded image via a browser-based label interface. Fig. 8, for example, illustrates a screenshot of a labeling interface 800 displayed within a web browser. In the labeling interface 800, a clinician may select (e.g., via a click or tap) a location of the fovea and optic disc. Once selected, the clinician may click on the update points button 802 to send indications of the selected locations to the diagnosis server 108.
[0046] The example image routine 162 may use indicated locations of the fovea and optic disc to determine the location of the macula, in an implementation. Subsequently, the image routine 162 may utilize the location of the macula, and hence regions outside the macula, in the application of certain filters. Although input of the fovea and optic disc locations via clinician interaction is discussed above, it should be noted that an image routine may use computer vision techniques to automatically detect the location of the fovea and optic disc. Even if a clinician manually indicates the location of the fovea and/or the optic disc, the image routine 162 may still execute an automated technique to refine and validate such input.
[0047] In some implementations, image pre-processing may include an analysis on four distinct features: (i) the presence or absence of discrete hypofluorescent marks outside the macula; (ii) the presence or absence of hypofluorescence encompassing the macula; (iii) the presence or absence of hyperfluorescent marks outside the macula; and (iv) the presence or absence of a hyperfluorescent ring surrounding the macula (see example ring images included in Fig. 9). Such an analysis may include an objective quantification of these features such as the quantification discussed in U.S. Provisional Application No. 61/858,915 entitled "Automated measurement of changes in retinal, retinal pigment epithelial, or choroidal disease" and filed on July 26, 2013, the entire disclosure of which is hereby incorporated by reference herein. Other methods of object detection may include the analysis of image texture, in some implementations. This analysis of image texture may involve a convolution of images with a sliding window detector, where the sliding window detector incorporates functions that respond uniquely to different spatial frequencies of intensity variation, for example. Further, the image routine 162 may utilize edge detection methods to delineate detected objects represented by consistent areas of a specific texture. Such edge detection methods may utilize established algorithms such as active contour models (snakes), curvelet transforms, or Gabor wavelets, for example.
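The sliding-window texture analysis described above can be illustrated with a small Gabor-style filter bank. This sketch is illustrative only: the kernel construction, sizes, wavelengths, and the synthetic grating patch are assumptions, not the patented detector.

```python
import numpy as np

def gabor_kernel(size, wavelength, sigma):
    """Real-valued Gabor kernel: a cosine grating of the given wavelength
    under a Gaussian envelope. It responds most strongly to intensity
    variation at the matching spatial frequency."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * x / wavelength)

def response_energy(image, kernel):
    """Mean squared filter response over all valid window positions
    (a brute-force sliding-window convolution)."""
    kh, kw = kernel.shape
    h, w = image.shape
    total, count = 0.0, 0
    for i in range(h - kh + 1):
        for j in range(w - kw + 1):
            r = np.sum(image[i:i + kh, j:j + kw] * kernel)
            total += r * r
            count += 1
    return total / count

# Synthetic texture patch: a vertical grating with a 6-pixel period.
x = np.arange(32)
patch = np.tile(np.cos(2 * np.pi * x / 6), (32, 1))

fine = gabor_kernel(9, wavelength=6, sigma=3)     # tuned to the patch
coarse = gabor_kernel(9, wavelength=18, sigma=3)  # tuned to coarser texture
```

The kernel tuned to the patch's spatial frequency yields a much larger response energy than the mismatched kernel, which is the property a sliding-window detector exploits to discriminate textures.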
[0048] In another implementation, image pre-processing may include the extraction of other clinically relevant features. For example, in fundus AF, hyperfluorescent flecks and hypofluorescent flecks (see Fig. 10 for an example image with hyperfluorescent flecks and see Fig. 11 for an example image with hypofluorescent flecks) are
distinguishing features of different retinal dystrophies. The image routine 162 may extract these features using a combination of object detection methods to identify blobs, flecks, drusen deposits, and any other additional clinically related features.
[0049] As an output of image pre-processing, the image routine 162 may generate one or more feature vectors representing the imagery data about the patient. For example, the feature vectors may include indications of the total area of hyperfluorescence and/or hypofluorescence. These "features" may, in some implementations, be represented by bounded continuous values. However, in other implementations, the image routine 162 may transform image features to discrete Boolean values (e.g., by comparing hyperfluorescent or hypofluorescent indications to one or more threshold values).
[0050] The image routine 162 may store image feature vectors, representative of the imagery data corresponding to the patient, in a database, such as the output database 114. In this manner, the image routine 162 or other routines of the diagnosis server 108 may access the image feature vectors later in the flow of method 300.

[0051] (B) With regard to uploaded family history data corresponding to a patient, the diagnosis server 108 may execute the genetics routine 164 to generate probabilities corresponding to genetic modes of transmission.
[0052] In one implementation, the genetics routine 164, when executed by the diagnosis server 108, compares provided family history data to stored rules (e.g., stored in the memory 152a) regarding a variety of modes of transmission. As part of the comparison, the genetics routine 164 may compute a probability of each genetic mode of transmission. By way of example, modes of transmission which may be assessed by the genetics routine 164 include autosomal dominant transmission, autosomal recessive transmission, X-linked recessive transmission, and a simplex case, and the
corresponding rules may be based on clinically accepted probabilities or averages.
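A drastically simplified, rule-of-thumb sketch of such a comparison follows. Every rule, multiplier, and field name below is an invented placeholder for illustration, not a clinically accepted probability.

```python
def transmission_probabilities(history):
    """Score each mode of transmission against simple pedigree rules and
    normalize the scores into probabilities. The rules and weights here
    are illustrative placeholders, not clinically validated values."""
    scores = {"autosomal_dominant": 1.0, "autosomal_recessive": 1.0,
              "x_linked_recessive": 1.0, "simplex": 1.0}

    if history["affected_parent"]:
        scores["autosomal_dominant"] *= 4.0     # vertical transmission
        scores["x_linked_recessive"] *= 0.5
    if history["consanguinity"]:
        scores["autosomal_recessive"] *= 3.0    # raises recessive likelihood
    if history["only_males_affected"]:
        scores["x_linked_recessive"] *= 3.0
    if history["affected_relatives"] == 0:
        scores["simplex"] *= 5.0                # isolated case

    total = sum(scores.values())
    return {mode: score / total for mode, score in scores.items()}

history = {"affected_parent": False, "consanguinity": True,
           "only_males_affected": False, "affected_relatives": 2}
probs = transmission_probabilities(history)
```

With this example history (consanguineous parents, two affected relatives), the autosomal recessive mode receives the highest normalized probability, as a clinician would expect.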
[0053] As an output, the genetics routine 164 may generate a numeric probability score corresponding to the transmission of a particular genetic disorder. For example, the genetics routine 164 may store numeric probabilities in a database, such as the output database 114.
[0054] (C) With regard to uploaded clinical data corresponding to a patient, the clinical routine 166, when executed by the diagnosis server 108, may generate a clinical feature vector representative of the patient whose patient data has been received by the diagnosis server 108.
[0055] In some implementations, clinical data received at block 302 includes a plurality of clinical parameters, such as ERG parameters (or measurements), visual acuity parameters, and visual field parameters. Based on these parameters, the clinical routine 166 may calculate bounded continuous values and/or discrete Boolean values. For example, ERG parameters are natively continuous variables, and, therefore, the clinical routine 166 may only need to normalize the numeric ERG parameters to correct for age and equipment variability.
[0056] Visual acuity (or BCVA) has corresponding descriptive (e.g., 20/50), rather than numeric, parameters. Thus, the clinical routine 166 may translate visual acuity parameters to a known continuous scale, such as the Logarithm of Minimum Angle of Resolution (logMAR) scale. Visual field, on the other hand, is a feature-based description (central scotoma, enlarged optic nerve scotoma, decreased range, etc.). To translate this feature-based description, the clinical routine 166 may generate a Boolean representation. For example, the Boolean vector {True, False, True, ...} may represent features that are present in a visual field, such as peripheral visual field constriction (e.g., True in this case), central scotoma at the macula (e.g., False in this case), enlarged scotoma at the optic nerve (e.g., True in this case), etc.
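The two translations described above can be sketched briefly in Python. This is an illustrative sketch only; the function names and the fixed feature list are assumptions, not the disclosed implementation.

```python
import math

def snellen_to_logmar(snellen):
    """Convert a descriptive Snellen acuity such as '20/50' to the
    continuous logMAR scale (20/20 -> 0.0, 20/200 -> 1.0)."""
    numerator, denominator = (float(part) for part in snellen.split("/"))
    return math.log10(denominator / numerator)

# Fixed feature positions so that vectors from different patients can be
# compared position by position (an illustrative, not exhaustive, list).
FIELD_FEATURES = ["peripheral_constriction", "central_scotoma",
                  "enlarged_optic_nerve_scotoma"]

def visual_field_vector(findings):
    """Translate a feature-based visual field description (a set of
    findings) into a Boolean vector with one fixed slot per feature."""
    return [feature in findings for feature in FIELD_FEATURES]

acuity = snellen_to_logmar("20/50")
field = visual_field_vector({"peripheral_constriction",
                             "enlarged_optic_nerve_scotoma"})
```

Here `acuity` evaluates to log10(50/20) ≈ 0.398 and `field` to [True, False, True], matching the example Boolean vector above.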
[0057] A Boolean representation will allow for later comparison to data in the mutation proven database 112, which may have patient information that has been tagged in an identical manner, in an implementation. For example, a routine may compare two given clinical feature vectors with paired True-False values at the same position in each vector. Such a comparison is further discussed with reference to block 308.
[0058] Further, clinical feature vectors and modes of genetic transmission may be combined to create a single feature vector that is representative of the patient whose clinical data has been input into the system, in an implementation. For example, the clinical routine 166 may utilize numeric probabilities of genetic modes of transmission (e.g., generated from family history data) to weight and combine various feature vectors generated from clinical data (e.g., a BCVA feature vector, a visual field feature vector, and an ERG feature vector).
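One hedged sketch of this weighting-and-combining step is shown below. The particular weights and sub-vector values are invented for illustration; in practice the weights might be derived from the mode-of-transmission probabilities.

```python
import numpy as np

def combine_clinical_vectors(sub_vectors, weights):
    """Scale each clinical sub-vector (e.g., BCVA, visual field, ERG) by a
    weight and concatenate the results into a single patient feature
    vector."""
    parts = [weight * np.asarray(vector, dtype=float)
             for vector, weight in zip(sub_vectors, weights)]
    return np.concatenate(parts)

bcva = [0.398]            # logMAR acuity
field = [1.0, 0.0, 1.0]   # Boolean visual field features encoded as 0/1
erg = [0.7, 0.2]          # normalized ERG amplitudes
patient_vector = combine_clinical_vectors([bcva, field, erg],
                                          weights=[0.5, 0.3, 0.8])
```

The result is a single six-element vector representing the patient, ready for comparison against similarly constructed vectors in a database.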
[0059] As an output, the clinical routine 166 may generate one or more numeric or Boolean-based feature vectors representing the patient's clinical data. For example, the clinical routine 166 may store continuous/Boolean vectors in a database, such as the output database 114.
[0060] Returning now to Fig. 3, after pre-processing of the patient data,
representations of clinical, demographic, and imagery data (e.g., feature vectors corresponding to the respective type of data or portions of the data itself) are compared with data in the mutation proven database 112 (block 306). For example, the diagnosis routine 168 (executed by the diagnosis server 108) may query the mutation proven database 112 against a numerical clinical feature vector, or, in the case of a Boolean feature vector, the diagnosis routine 168 may compare two given feature vectors whose True-False values are paired. Likewise, the diagnosis routine 168 may compare image feature vectors showing specific disease markers to records in the mutation proven database 112 corresponding to images tagged with disease markers.
[0061] As a result of the comparison with the mutation proven database 112, ranked lists of mutation proven patients are generated (block 308). For example, a comparison of a clinical feature vector with data in the mutation proven database 112 may return a ranked list of "nearest neighbors" to the clinical feature vector. In one implementation, the diagnosis routine 168 may execute an approximate nearest neighbor search in an arbitrarily high dimensional space, or other suitable machine learning routine, to identify the entries (i.e., known patients with proven genetic diagnoses) in the mutation proven database 112 that have similarities with the clinical/imagery data about the patient and corresponding distances between the patient and the entries in the mutation proven database 112.
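As a concrete, simplified illustration of such a search: the toy database, gene names, and brute-force Hamming metric below are assumptions for exposition; the actual system may use approximate nearest neighbor methods in much higher dimensional spaces.

```python
import numpy as np

# Hypothetical mutation-proven database: Boolean clinical vectors tagged
# with the proven causative gene of each known patient (invented data).
database = {
    "patient_a": ("RPGR",  np.array([1, 0, 1, 1, 0], dtype=bool)),
    "patient_b": ("ABCA4", np.array([0, 1, 0, 0, 1], dtype=bool)),
    "patient_c": ("RP2",   np.array([1, 0, 1, 0, 0], dtype=bool)),
}

def ranked_matches(query):
    """Brute-force nearest-neighbor search: rank database entries by
    Hamming distance (the number of paired True-False positions that
    disagree) between the query vector and each stored vector."""
    scored = []
    for patient_id, (gene, vector) in database.items():
        distance = int(np.sum(query != vector))
        scored.append((distance, patient_id, gene))
    return sorted(scored)

query = np.array([1, 0, 1, 1, 1], dtype=bool)
ranking = ranked_matches(query)
```

The nearest neighbor (smallest Hamming distance) heads the ranked list, so the query patient's closest mutation-proven match and its causative gene can be read off directly.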
[0062] In some implementations, ranked lists of matches in the mutation proven database may be ordered lists, whereas, in other implementations, ranked lists of matches may be sorted according to corresponding probabilities. Further, the diagnosis routine 168 may generate one or more ranked lists of matches in the mutation proven database 112 for each of the types of data received at block 302. For example, a ranked list of matches may be generated for each of an imagery feature vector, a clinical feature vector, and a plurality of demographic information. The ranked lists of matches may be stored in corresponding data structures of the output database 114, such as in the clinical output 132 and the imagery output 134 (block 310).
[0063] Next, multiple ranked lists of matches to proven genetic diagnoses (e.g., one list corresponding to clinical data and one list corresponding to imagery data) are weighted and combined to produce a final ranked list of genetic diagnoses (block 312). The weights used to combine multiple ranked lists into a final ranked list of genetic diagnoses may be at least partially based on human input (e.g., via weighting of parameters) from clinicians with expertise in inherited retinal diseases. However, the weights used to generate the final ranked list of genetic diagnoses may be refined over time based on further accumulated data in the mutation proven database 112 and/or based on further learning of algorithms. For example, expert clinicians may initialize the diagnosis routine 168 with certain weights based on clinical experience. Then, as the diagnosis routine 168 is verified, the weights of the diagnosis routine 168 may be refined. In general, the diagnosis routine 168 may use any suitable supervised or unsupervised machine learning technique to update weights in light of verification results and/or newly accumulated data.
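The weighted combination of per-data-type ranked lists might be sketched as a simple weighted score fusion. The gene names, probabilities, and weights below are invented for illustration; they stand in for the expert-initialized, later-refined weights described above.

```python
def combine_ranked_lists(lists, weights):
    """Fuse several per-data-type rankings (gene -> probability) into one
    final ranking via a weighted sum of the per-list probabilities,
    renormalized so the final probabilities sum to one."""
    combined = {}
    for ranking, weight in zip(lists, weights):
        for gene, prob in ranking.items():
            combined[gene] = combined.get(gene, 0.0) + weight * prob
    total = sum(combined.values())
    final = {gene: score / total for gene, score in combined.items()}
    return sorted(final.items(), key=lambda item: item[1], reverse=True)

# One ranked list from clinical data, one from imagery data (invented).
clinical = {"RPGR": 0.6, "RP2": 0.3, "ABCA4": 0.1}
imagery  = {"RPGR": 0.2, "RP2": 0.5, "ABCA4": 0.3}
final_ranking = combine_ranked_lists([clinical, imagery], [0.7, 0.3])
```

Because the clinical list carries more weight here, RPGR stays on top of the fused ranking even though the imagery list favors RP2; changing the weights changes the fusion accordingly.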
[0064] Because of the collective and flexible nature of the method 300, the method 300 may accommodate variable degrees of input data. If, for example, visual field data is unavailable, the diagnosis routine 168 may still calculate the most likely genetic diagnoses based on other available information, such as BCVA or ERG data. In general, the method 300 may utilize any amount and combination of dissimilar types of data related to inherited retinal disease.
[0065] The diagnosis routine 168 may generate the final ranked list of genetic diagnoses as an ordered list of genetic diagnoses, or the diagnosis routine 168 may calculate a numeric probability (e.g., 90%) corresponding to each genetic diagnosis in the list of genetic diagnoses. As with the weights, the algorithms employed by the diagnosis routine 168 to generate such probabilities may include machine learning algorithms that may be refined after verification and the accumulation of additional data.
[0066] Finally, the final ranked list of likely genetic diagnoses is sent to the end user device, operated by the clinician, to provide an indication of likely causative genes and/or gene mutations corresponding to the patient (block 314). For example, the ranked list of genetic diagnoses may be sent to the end user device as a table of genes and corresponding probabilities ordered according to descending probability.
Subsequently, a clinician may utilize such a ranked list of genetic diagnoses as a guide for genetic screening, such as prioritizing genes for consideration in whole exome sequencing or whole genome sequencing.
[0067] Thus, the method 300 may aid clinicians in rapidly and efficiently diagnosing a patient's retinal condition and guiding genetic testing strategies. For example, a clinician operating the end user device 102 may both rapidly exploit a variety of dissimilar types of data and reduce the logistical complexity of having to screen a large number of genes. Further, the method 300 may provide a clinician sufficient automated capabilities to consider a plurality of likely retinal diseases, even in the absence of specialized medical training in inherited retinal diseases.
[0068] For clarity, Fig. 12 illustrates a screenshot of an example results page 1200 that may be presented to a clinician (e.g., within a web browser) after a ranked list of genetic diagnoses is sent to the end user device 102. The results page includes a brief summary 1202 of information about the patient and a table 1204 of gene-probability pairs. In this example case, the genes are represented by a corresponding gene name (RPGR, RP2, etc.), and the table 1204 may only include genes with a probability, or likelihood, of 1% or greater. However, it is understood that a ranked list of genes may be presented to a clinician in any suitable representation, such as a pie chart, graph, bar chart, list, etc.
[0069] Moreover, in some implementations, at least some of the diagnoses in a ranked list of genetic diagnoses presented to a clinician may be selectable by the clinician. In this manner, the clinician may view further information about the diagnosis. For example, the diagnosis server 108 may generate diagnosis summaries
corresponding to each of the diagnoses in a ranked list of genetic diagnoses and based on data in the mutation proven database. For instance, the diagnosis server 108 may generate a diagnosis summary with trends indicating implications of a particular diagnosis on patients matching the demographics of the patient whose information is under consideration. In general, such summaries may include any representative information about a diagnosis, such as medical book excerpts, parameter trends (e.g., ERG and BCVA trends), exemplary images, etc.

Claims

We claim:
1. A computer-implemented method for automatically diagnosing inherited retinal disease, the method comprising:
receiving, via a network interface, a plurality of dissimilar types of data from an end user device, the plurality of dissimilar types of data being related to retinal disease and the plurality of dissimilar types of data corresponding to a patient;
pre-processing, by one or more processors, at least one of the plurality of dissimilar types of data to generate a feature vector descriptive of the patient;
for each of the plurality of dissimilar types of data:
comparing, by the one or more processors, portions of the respective type of data or a corresponding feature vector to data in a mutation proven database, wherein the data in the mutation proven database is similar to the respective type of data and wherein the data in the mutation proven database corresponds to a plurality of patients with known genetic diagnoses,
generating, by the one or more processors, a ranked list of matches between the patient and the plurality of patients with known genetic diagnoses, and
storing, by the one or more processors, the ranked list of matches in an output database,
aggregating, by the one or more processors, a plurality of ranked lists of matches in the output database to generate a ranked list of genetic diagnoses corresponding to the patient; and
sending, via the network interface, an indication of the ranked list of genetic diagnoses to the end user device.
2. The computer-implemented method of claim 1, wherein the plurality of dissimilar types of data includes at least two of demographic data, retinal imagery data, or clinical data about the patient.
3. The computer-implemented method of either claim 1 or 2, further comprising executing, with the one or more processors, an image routine to: identify pathological markers of retinal disease within each of a plurality of images of a retina,
quantify the pathological markers of retinal disease by aligning the plurality of images of the retina and collectively analyzing the aligned plurality of images of the retina, and
categorize the pathological markers of retinal disease into categories each indicative of a particular retinal dystrophy.
4. The computer-implemented method of any one of claims 1-3, wherein pre-processing at least one of the plurality of dissimilar types of data to generate the feature vector includes generating an imagery feature vector descriptive of the identified, quantified, and categorized pathological markers of retinal disease.
5. The computer-implemented method of any one of claims 1-4, further comprising.
6. The computer-implemented method of any one of claims 1-5, further comprising receiving, via the network interface, family history data;
executing, with the one or more processors, a genetics routine to:
determine a possible diagnosis based on the family history data, generate a probability corresponding to the possible diagnosis, and store the probability corresponding to the possible diagnosis in the output database.
7. The computer-implemented method of claim 6, wherein determining a possible diagnosis and generating a probability corresponding to the possible diagnosis includes detecting the presence or absence of a form of transmission in the family history data.
8. The computer-implemented method of claim 6,
wherein the plurality of dissimilar types of data includes clinical data and wherein pre-processing at least one of the plurality of dissimilar types of data to generate the feature vector includes:
converting each of a plurality of clinical parameters in the clinical data to a score, and
combining scores corresponding to the plurality of clinical parameters with the probability corresponding to the possible diagnosis to generate the clinical feature vector representative of the clinical data and the family history data.
9. The computer-implemented method of claim 8, wherein combining scores corresponding to the plurality of clinical parameters with the probability corresponding to the possible diagnosis is at least partially based on the probability corresponding to the possible diagnosis.
10. The computer-implemented method of any one of claims 1-9, wherein the plurality of dissimilar types of data includes clinical data and wherein pre-processing at least one of the plurality of dissimilar types of data to generate the feature vector includes:
converting each of a plurality of clinical parameters in the clinical data to a corresponding score, and
combining scores corresponding to the plurality of clinical parameters to generate the clinical feature vector representative of the clinical data.
11. The computer-implemented method of any one of claims 1-10, wherein aggregating the plurality of ranked lists of matches in the output database to generate the ranked list of genetic diagnoses corresponding to the patient includes:
assigning a weight to each of the plurality of ranked lists of matches, and combining the plurality of ranked lists of matches based on the assigned weights to generate the ranked list of genetic diagnoses.
12. The computer-implemented method of any one of claims 1-11, wherein aggregating the plurality of ranked lists of matches in the output database to generate the ranked list of genetic diagnoses includes executing a learning algorithm based on the data in the mutation proven database and the plurality of dissimilar types of data.
13. The computer-implemented method of any one of claims 1-12, wherein the ranked list of genetic diagnoses includes a list of causative genes and a gene probability corresponding to each causative gene in the list of causative genes.
14. The computer-implemented method of any one of claims 1-13, further comprising:
receiving, via the network interface, a request for a disease summary
corresponding to one of the diagnoses in the ranked list of genetic diagnoses;
generating, with the one or more processors, a disease summary based on at least some of the data in the mutation proven database, wherein the disease summary includes information indicative of potential implications of the one of the diagnoses at future times; and
sending, via the network interface, the disease summary to the end user device.
15. A computer device for automatically diagnosing inherited retinal disease, the computer device comprising:
one or more processors; and
one or more memories coupled to the one or more processors;
wherein the one or more memories include computer executable instructions stored therein that, when executed by the one or more processors, cause the one or more processors to:
receive, via a network interface, a plurality of dissimilar types of data from an end user device, the plurality of dissimilar types of data being related to retinal disease and the plurality of dissimilar types of data corresponding to a patient; pre-process at least one of the plurality of dissimilar types of data to generate a feature vector descriptive of the patient;
for each of the plurality of dissimilar types of data: compare portions of the respective type of data or a corresponding feature vector to data in a mutation proven database, wherein the data in the mutation proven database is similar to the respective type of data and wherein the data in the mutation proven database corresponds to a plurality of patients with known diagnoses,
generate a ranked list of matches between the patient and the plurality of patients with known diagnoses, and
store the ranked list of matches in an output database, aggregate a plurality of ranked lists of matches in the output database to generate a ranked list of genetic diagnoses corresponding to the patient; and send an indication of the ranked list of genetic diagnoses to the end user device.
PCT/US2014/056891 2013-09-24 2014-09-23 Systems and methods for diagnosing inherited retinal diseases WO2015047981A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361881711P 2013-09-24 2013-09-24
US61/881,711 2013-09-24

Publications (1)

Publication Number Publication Date
WO2015047981A1 true WO2015047981A1 (en) 2015-04-02

Family

ID=52691927


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109124836A (en) * 2018-09-18 2019-01-04 北京爱康宜诚医疗器材有限公司 The determination method and device of acetabular bone defect processing mode

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11380440B1 (en) * 2011-09-14 2022-07-05 Cerner Innovation, Inc. Marker screening and signal detection
US11869671B1 (en) 2011-09-14 2024-01-09 Cerner Innovation, Inc. Context-sensitive health outcome surveillance and signal detection
US9757023B2 (en) 2015-05-27 2017-09-12 The Regents Of The University Of Michigan Optic disc detection in retinal autofluorescence images
US11610311B2 (en) 2016-10-13 2023-03-21 Translatum Medicus, Inc. Systems and methods for detection of ocular disease
CN106725292B (en) * 2016-12-30 2018-07-24 深圳市新产业眼科新技术有限公司 Multispectral fundus imaging remote diagnosis system and its operation method
AU2018372578A1 (en) 2017-11-27 2020-06-11 Retispec Inc. Hyperspectral image-guided raman ocular imager for Alzheimer's Disease pathologies
IT201800003417A1 (en) * 2018-03-09 2019-09-09 Andrea Sodi A method of storing and processing data from patients suffering from Stargardt's disease
EP3586720A1 (en) * 2018-06-29 2020-01-01 Carl Zeiss Vision International GmbH Method for optimising an optical aid using automatic determination of subjective vision
US20200004928A1 (en) * 2018-06-29 2020-01-02 Roche Sequencing Solutions, Inc. Computing device with improved user interface for interpreting and visualizing data
US10978189B2 (en) * 2018-07-19 2021-04-13 Optum, Inc. Digital representations of past, current, and future health using vectors

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040115646A1 (en) * 2002-12-13 2004-06-17 Affymetrix, Inc. Methods, computer software products and systems for correlating gene lists
US20110160562A1 (en) * 2009-12-02 2011-06-30 De Oliveira E Ramos Joao Diogo Methods and Systems for Detection of Retinal Changes
WO2013058907A1 (en) * 2011-10-17 2013-04-25 Good Start Genetics, Inc. Analysis methods
US20130184161A1 (en) * 2009-10-22 2013-07-18 Stephen F. Kingsmore Methods and Systems for Medical Sequencing Analysis
US20130208245A1 (en) * 2010-05-05 2013-08-15 Melanie Crombie Williams Campbell Method and system for imaging amyloid beta in the retina of the eye in association with alzheimer's disease

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5692220A (en) * 1993-09-02 1997-11-25 Coulter Corporation Decision support system and method for diagnosis consultation in laboratory hematopathology
US5868134A (en) * 1993-09-21 1999-02-09 Kabushiki Kaisha Topcon Retinal disease analyzer
US20040122709A1 (en) * 2002-12-18 2004-06-24 Avinash Gopal B. Medical procedure prioritization system and method utilizing integrated knowledge base
US7756309B2 (en) * 2005-07-27 2010-07-13 Bioimagene, Inc. Method and system for storing, indexing and searching medical images using anatomical structures of interest
WO2009064911A2 (en) * 2007-11-13 2009-05-22 The Regents Of The University Of Michigan Method and apparatus for detecting diseases associated with the eye
US8820931B2 (en) * 2008-07-18 2014-09-02 Doheny Eye Institute Optical coherence tomography-based ophthalmic testing methods, devices and systems
US20120102405A1 (en) * 2010-10-25 2012-04-26 Evidence-Based Solutions, Inc. System and method for matching person-specific data with evidence resulting in recommended actions

Cited By (1)

Publication number Priority date Publication date Assignee Title
CN109124836A (en) * 2018-09-18 2019-01-04 北京爱康宜诚医疗器材有限公司 The determination method and device of acetabular bone defect processing mode

Also Published As

Publication number Publication date
US20150088870A1 (en) 2015-03-26
US9524304B2 (en) 2016-12-20

Similar Documents

Publication Publication Date Title
US9524304B2 (en) Systems and methods for diagnosing inherited retinal diseases
Kawahara et al. Seven-point checklist and skin lesion classification using multitask multimodal neural nets
US11631175B2 (en) AI-based heat map generating system and methods for use therewith
CN110349156B (en) Method and device for identifying lesion characteristics in fundus picture and storage medium
Li et al. Deep learning-based automated detection of glaucomatous optic neuropathy on color fundus photographs
Ramasamy et al. Detection of diabetic retinopathy using a fusion of textural and ridgelet features of retinal images and sequential minimal optimization classifier
Niemeijer et al. Automated detection and differentiation of drusen, exudates, and cotton-wool spots in digital color fundus photographs for diabetic retinopathy diagnosis
Abramoff et al. Automated detection of diabetic retinopathy: barriers to translation into clinical practice
Wu et al. Gamma challenge: glaucoma grading from multi-modality images
EP2812828B1 (en) Interactive optimization of scan databases for statistical testing
Prasad et al. Multiple eye disease detection using Deep Neural Network
Coan et al. Automatic detection of glaucoma via fundus imaging and artificial intelligence: A review
Jordan et al. A review of feature-based retinal image analysis
US20230162362A1 (en) Method and system for estimating early progression of dementia from human head images
CN111508603A (en) Birth defect prediction and risk assessment method and system based on machine learning and electronic equipment
US20220133215A1 (en) Method for evaluating skin lesions using artificial intelligence
US20220375610A1 (en) Multi-Variable Heatmaps for Computer-Aided Diagnostic Models
Güven Automatic detection of age-related macular degeneration pathologies in retinal fundus images
Murugan et al. An abnormality detection of retinal fundus images by deep convolutional neural networks
Nisha et al. A novel method to improve inter-clinician variation in the diagnosis of retinopathy of prematurity using machine learning
Bali et al. Analysis of Deep Learning Techniques for Prediction of Eye Diseases: A Systematic Review
Zhu et al. Detecting abnormality in optic nerve head images using a feature extraction analysis
Sridhar et al. Artificial intelligence in medicine: diabetes as a model
Orlando et al. Retinal blood vessel segmentation in high resolution fundus photographs using automated feature parameter estimation
Bhambra et al. Deep learning for ultra-widefield imaging: A scoping review

Legal Events

Date Code Title Description
121 Ep: The EPO has been informed by WIPO that EP was designated in this application
    Ref document number: 14849360; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: PCT application non-entry in the European phase
    Ref document number: 14849360; Country of ref document: EP; Kind code of ref document: A1