US10996813B2 - Digital treatment planning by modeling inter-arch collisions - Google Patents

Digital treatment planning by modeling inter-arch collisions

Info

Publication number
US10996813B2
US10996813B2 (application US16/457,754; US201916457754A)
Authority
US
United States
Prior art keywords
model
treatment
view
teeth
subject
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US16/457,754
Other versions
US20200004402A1 (en)
Inventor
Svetlana Makarenkova
Artem KUANBEKOV
Aleksandr Zhulin
Boris LIKHTMAN
Vladimir Grenaderov
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Align Technology Inc
Original Assignee
Align Technology Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Align Technology Inc filed Critical Align Technology Inc
Priority to US16/457,754 (US10996813B2)
Assigned to ALIGN TECHNOLOGY, INC. Assignment of assignors' interest (see document for details). Assignors: LIKHTMAN, Boris; ZHULIN, Aleksandr; GRENADEROV, Vladimir; KUANBEKOV, Artem; MAKARENKOVA, Svetlana
Publication of US20200004402A1
Priority to US17/246,547 (US11449191B2)
Application granted
Publication of US10996813B2
Priority to US17/945,957 (US11809214B2)
Priority to US18/472,209 (US12067210B2)
Priority to US18/770,614 (US20240361879A1)

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61CDENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C7/00Orthodontics, i.e. obtaining or maintaining the desired position of teeth, e.g. by straightening, evening, regulating, separating, or by correcting malocclusions
    • A61C7/002Orthodontic computer assisted systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61CDENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C19/00Dental auxiliary appliances
    • A61C19/04Measuring instruments specially adapted for dentistry
    • A61C19/05Measuring instruments specially adapted for dentistry for determining occlusion

Definitions

  • Embodiments of the invention relate generally to systems and methods for the visualization of teeth.
  • Orthodontic devices such as aligners, palatal expanders, retainers, and dental implants can be used to adjust the position of teeth and to treat various dental irregularities.
  • a three-dimensional (3D) digital model of the subject's teeth, dentition, and gingiva can be constructed from a 3D scan of the subject's mouth, teeth, dentition, and gingiva.
  • the 3D model of the subject's teeth and dentition can be displayed graphically to the doctor on a display using a computing system with memory and software.
  • the methods and apparatuses may relate to orthodontic treatment planning, including the visualization of teeth for modifying, enhancing and improving treatment plans.
  • described herein are methods and apparatuses for reviewing, analyzing and/or modifying orthodontic treatment plans. These methods may include one or more user interfaces that are configured to improve review and modification of orthodontic treatment planning.
  • a method may include: displaying a staging toolbar on a first portion of a display, wherein the staging toolbar comprises a plurality of digital buttons that correspond to a plurality of three dimensional (3D) digital models of a subject's dentition, wherein the plurality of 3D models includes a first model that shows an arrangement of a subject's teeth before receiving a treatment, one or more intermediate models that show an arrangement of the subject's teeth during a stage of the treatment, and a final model that shows an arrangement of the subject's teeth after receiving the treatment; displaying a displayed 3D model corresponding to one of the plurality of 3D models; changing the displayed 3D model to correspond to whichever digital button is selected by a user; adjusting a view of the displayed 3D model shown on the display based on a user input, wherein the view of the displayed 3D model is applied to the changed displayed 3D model as the user selects the digital buttons.
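The staging-toolbar behavior described above, where a shared view persists while the user steps through stage models, can be sketched in Python. The class and field names here are illustrative assumptions, not the patented implementation:

```python
from dataclasses import dataclass, field

@dataclass
class ViewState:
    """Shared camera parameters (rotation as Euler angles in degrees)."""
    rotation: tuple = (0.0, 0.0, 0.0)
    zoom: float = 1.0
    pan: tuple = (0.0, 0.0)

@dataclass
class StagingToolbar:
    """One 3D model per stage: index 0 is the pre-treatment model,
    the last index is the final model, everything between is intermediate."""
    stage_models: list
    view: ViewState = field(default_factory=ViewState)
    current: int = 0

    def select_stage(self, index):
        """Swap the displayed model when a stage button is pressed;
        the shared ViewState is untouched, so the user's rotation,
        zoom, and pan carry over to the newly selected stage."""
        if not 0 <= index < len(self.stage_models):
            raise IndexError("no such treatment stage")
        self.current = index
        return self.stage_models[index]
```

Keeping the view in a single object shared by all stage buttons is one simple way to realize "the view ... is applied to the changed displayed 3D model as the user selects the digital buttons."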
  • the staging toolbar may be a virtual toolbar including the plurality of digital buttons, which may be arranged within the display (screen, touchscreen, virtual display, augmented reality display, etc.).
  • the staging toolbar may be an arrangement of virtual buttons on the top, side(s) and/or bottom of the display that may be selected by a user, e.g., by clicking on them or otherwise selecting them.
  • the digital model of the subject's dentition may include a 3D surface (or in some variations surface and volumetric) model of the subject's upper and/or lower arch, including teeth and in some variations gingiva (e.g., particularly the portion of gingiva around the teeth).
  • the 3D model may be segmented into individual teeth that may be separately selected and/or moved by the user or system.
  • the system or method may store user inputs and/or generate user output, e.g., modifications to the display, based on user selections from the controls and the processing by the system.
  • At least some of the digital buttons may correspond to overcorrection stages.
  • the digital buttons that correspond to the overcorrection stages may be hidden or revealed by a user-controlled switch (e.g., a virtual button on the display that allows the user to toggle between showing and hiding the overcorrection stages), and/or by selecting the overcorrection stages as part of an actual treatment plan.
  • the user input may include adjustments to the display of the 3D model, including one or more of: a rotation, a zoom, or a pan of the displayed 3D model. Additional tools may include showing the surface of the 3D model, showing a wireframe of the 3D model, changing the color of the 3D model, etc.
  • the user input may include selecting from a set of preset views, such as showing the 3D model of the upper and/or lower jaw in a frontal view, a left side view, a right side view, a back view, etc.
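A preset-view selection like the one above can be modeled as a lookup from view name to camera orientation. The names mirror the views listed in the text; the specific angle values and function names are illustrative assumptions:

```python
# Preset camera orientations as (azimuth, elevation) in degrees.
# The angle values below are assumed for illustration only.
PRESET_VIEWS = {
    "frontal": (0.0, 0.0),
    "left side": (90.0, 0.0),
    "right side": (-90.0, 0.0),
    "back": (180.0, 0.0),
    "occlusal (upper)": (0.0, -90.0),
    "occlusal (lower)": (0.0, 90.0),
}

def select_preset(name):
    """Return the camera orientation for a named preset view."""
    try:
        return PRESET_VIEWS[name]
    except KeyError:
        raise ValueError(f"unknown preset view: {name}") from None
```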
  • the user input may include selecting to display or hide on the view of the teeth of the 3D model one or more of: tooth numbering, attachments, interproximal reduction spacing, and pontics.
  • These display options may be separately controlled, e.g., by including one or more virtual controls (e.g., buttons, switches, etc.) that allows the selection of each of these features individually or collectively.
  • the system may include processing to determine or suggest one or more of these features (e.g., determining automatically or semi-automatically tooth numbering, position, number and/or orientation of attachments, hooks, ramps, etc.).
  • Changing the displayed 3D model to correspond to whichever digital button is selected by a user may include calculating the viewing angle and magnification from a current displayed 3D model and applying the calculated viewing angle and magnification to a new 3D model from the plurality of 3D models corresponding to the digital button selected by the user.
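The calculation above, carrying the viewing angle over and recomputing magnification for the newly selected model, might look like the following sketch. The normalization against the bounding-box diagonal is an assumption about the renderer, not a detail taken from the patent:

```python
import math

def fit_magnification(bbox_min, bbox_max, viewport_extent=1.0):
    """Magnification that scales a model's bounding-box diagonal to a
    fixed viewport extent (a simplifying assumption about the renderer)."""
    return viewport_extent / math.dist(bbox_min, bbox_max)

def apply_view(current_view, new_model_bbox):
    """Carry the viewing angle of the currently displayed model over to
    a newly selected model, recomputing the magnification so the new
    model occupies the same apparent size on screen."""
    azimuth, elevation, _old_magnification = current_view
    bbox_min, bbox_max = new_model_bbox
    return (azimuth, elevation, fit_magnification(bbox_min, bbox_max))
```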
  • Also described herein are systems configured to perform any of the methods described herein. These systems may include one or more processors and may include a memory coupled to the one or more processors configured to store computer-program instructions that, when executed by the one or more processors, perform the methods.
  • a system may include: one or more processors; a memory coupled to the one or more processors, the memory configured to store computer-program instructions, that, when executed by the one or more processors, perform a computer-implemented method comprising: displaying a staging toolbar on a first portion of a display, wherein the staging toolbar comprises a plurality of digital buttons that correspond to a plurality of three dimensional (3D) digital models of a subject's dentition, wherein the plurality of 3D models includes a first model that shows an arrangement of a subject's teeth before receiving a treatment, one or more intermediate models that show an arrangement of the subject's teeth during a stage of the treatment, and a final model that shows an arrangement of the subject's teeth after receiving the treatment; displaying a displayed 3D model corresponding to one of the plurality of 3D models; changing the displayed 3D model to
  • any of the methods and apparatuses described herein may be configured to also or alternatively display multiple 3D models, including multiple treatment plans (each having the same or a different number of treatment stages), and/or display a 3D model (e.g., surface model) of an initial (unmodified) arrangement of the subject's teeth with one or more treatment plans (each having multiple treatment stages).
  • the system and method may enhance review of the treatment plan(s) by allowing the user to make changes in the appearance (angle, zoom, pan, etc.), and/or selection of a displayed treatment stage when displaying multiple treatment plans, of one of the displayed 3D models and concurrently making the same (or similar) changes in the other treatment plans.
  • any of these systems and methods may also address the problem of complexity associated with the display of one or more treatment plans, in which each treatment plan includes a large number of stages, and multiple potential ‘treatments’ at each stage, such as changes in the tooth position, angle, etc., as well as the components of the treatment applied or to be applied, such as interproximal reduction, extraction, ramps, attachments, hooks, etc. These components may be different at different stages of each treatment plan and may be widely different or similar between different treatment plans.
  • the methods and apparatuses may provide simplified techniques for controlling the otherwise complicated and information-dense displays. For example, in some variations the methods and apparatuses may include informative controls that allow toggling of display options on or off, but may also include information about the features being displayed or features related to those being displayed.
  • a first three dimensional (3D) model of a subject's dentition that shows an arrangement of the subject's teeth before receiving a treatment
  • a second 3D model that shows an arrangement of the subject's teeth subject to a first orthodontic treatment
  • the second 3D model is displayed at either a final stage or an intermediate treatment stage of the first orthodontic treatment
  • adjusting a view of both of the first 3D model and the second 3D model based on a user input modifying a view of one of the first 3D model or the second 3D model
  • displaying a staging toolbar on a portion of a display, wherein the staging toolbar comprises a plurality of digital buttons that correspond to a plurality of treatment stages of the first orthodontic treatment
  • changing the displayed stage of the second 3D model to correspond to a stage selected by the user from the staging toolbar.
  • Any of these methods may also include displaying a plurality of buttons corresponding to treatment features, wherein the buttons visually indicate the presence of the treatment feature in the first orthodontic treatment and further wherein the buttons visually indicate that the treatment feature is actively being displayed on either or both the first 3D model and the second 3D model.
  • the plurality of treatment features may include: tooth numbering, attachments, interproximal reduction spacing, and pontics.
  • the user input may include one or more of: a rotation, a zoom, or a pan of the displayed 3D models. In some variations, the user input includes selecting from a set of preset views.
  • a method may include: displaying, side-by-side on a display, a plurality of three dimensional (3D) models of a subject's dentition, wherein the plurality of 3D models includes two or more of: a first 3D model that shows an arrangement of the subject's teeth before receiving a treatment, a second 3D model that shows an arrangement of the subject's teeth subject to a first orthodontic treatment, and a third 3D model that shows an arrangement of the subject's teeth subject to a second orthodontic treatment; wherein when either or both the first 3D model and the second 3D model are displayed, the first 3D model and the second 3D model are displayed at either a final stage or an intermediate treatment stage; and adjusting a view of all of the displayed 3D models based on a user input modifying one of the plurality of 3D models.
  • any of these methods may include displaying a staging toolbar on a portion of a display, wherein the staging toolbar comprises a plurality of digital buttons that correspond to a plurality of treatment stages, and wherein when either or both the first 3D model and the second 3D model are displayed, changing the displayed stage to correspond to a stage selected by the user from the staging toolbar.
  • the method or apparatus may include displaying a plurality of buttons corresponding to treatment features, wherein the buttons visually indicate the presence of the treatment feature in the first orthodontic treatment or the second orthodontic treatment and further wherein the buttons visually indicate that that treatment feature is actively being displayed on either or both the first 3D model and the second 3D model.
  • the plurality of treatment features may comprise: tooth numbering, attachments, interproximal reduction spacing, hooks, ramps, pontics, etc.
  • the user input may include one or more of: a rotation, a zoom, or a pan of the displayed 3D models.
  • the user input may include selecting from a set of preset views.
  • a system for visualizing a subject's teeth comprising: one or more processors; a memory coupled to the one or more processors, the memory configured to store computer-program instructions, that, when executed by the one or more processors, perform a computer-implemented method comprising: displaying, side-by-side on a display, a first three dimensional (3D) model of a subject's dentition that shows an arrangement of the subject's teeth before receiving a treatment, and a second 3D model that shows an arrangement of the subject's teeth subject to a first orthodontic treatment; wherein the second 3D model is displayed at either a final stage or an intermediate treatment stage of the first orthodontic treatment; adjusting a view of both of the first 3D model and the second 3D model based on a user input modifying a view of one of the first 3D model or the second 3D model; displaying a staging toolbar on a portion of a display, wherein the staging toolbar comprises a plurality of digital buttons that correspond to a plurality of treatment stages
  • a system for visualizing a subject's teeth comprising: one or more processors; a memory coupled to the one or more processors, the memory configured to store computer-program instructions, that, when executed by the one or more processors, perform a computer-implemented method comprising: displaying, side-by-side on a display, a plurality of three dimensional (3D) models of a subject's dentition, wherein the plurality of 3D models includes two or more of: a first 3D model that shows an arrangement of the subject's teeth before receiving a treatment, a second 3D model that shows an arrangement of the subject's teeth subject to a first orthodontic treatment, and a third 3D model that shows an arrangement of the subject's teeth subject to a second orthodontic treatment; wherein when either or both the first 3D model and the second 3D model are displayed, the first 3D model and the second 3D model are displayed at either a final stage or an intermediate treatment stage; adjusting a view of all of the displayed 3D models based on a user
  • any of the systems and methods above may include this feature, which may be a user-selectable control (e.g., button, etc.), such as a virtual button that switches a view of the 3D model(s), such as a frontal and/or side view of one or more dental arches (e.g., upper and/or lower arches) to an occlusal view showing the occlusal surfaces of the upper and/or lower dental arches.
  • the occlusal surface may include indicator(s) of the severity of contact (collision) between the upper and lower jaw in normal intercuspation of the teeth.
  • the severity of contact may be shown relative to a threshold, with two states: “low” or “normal” contact and “high” or “severe” contact.
  • contact/collision may be shown as a heat map indicating a scaled degree of contact/collision and/or an annotated indicator (numerical, alphanumeric, etc.) indicating the contact severity.
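Both indicator styles above, the two-state threshold and the scaled heat map, reduce to mapping a penetration depth to a color. The following sketch assumes depth is measured in millimeters and uses illustrative threshold values and colors (normal in green, severe in red):

```python
def contact_color(depth_mm, threshold_mm=0.2):
    """Two-state indicator: green for normal contact, red for severe.
    The 0.2 mm penetration threshold is an illustrative assumption."""
    if depth_mm <= 0:
        return None  # the arches do not touch at this point
    return (255, 0, 0) if depth_mm > threshold_mm else (0, 160, 0)

def heatmap_color(depth_mm, max_depth_mm=0.5):
    """Scaled indicator: interpolate green -> red with penetration depth."""
    t = max(0.0, min(depth_mm / max_depth_mm, 1.0))
    return (int(255 * t), int(160 * (1 - t)), 0)
```

A numeric or alphanumeric annotation, as also described above, could simply print `depth_mm` next to the colored region.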
  • a method may include: displaying, on a display, a first three dimensional (3D) model of a subject's dentition that shows an arrangement of the subject's teeth before receiving a treatment, and a second 3D model that shows an arrangement of the subject's teeth subject to a first orthodontic treatment; wherein the second 3D model is displayed at either a final stage or an intermediate treatment stage of the first orthodontic treatment; adjusting a view of both of the first 3D model and the second 3D model based on a user input modifying a view of one of the first 3D model or the second 3D model; switching the view of both the first 3D model and the second 3D model to an occlusal view when the user selects a control, wherein the occlusal view shows occlusal surfaces of teeth in the first 3D model and the second 3D model, further wherein the occlusal view indicates one or more regions of inter-arch collisions on either or both of the first 3D model and the second 3D model.
  • the occlusal view may indicate one or more regions of inter-arch collisions using an indicator that is scaled to differentiate a relative degree of contact between an upper arch and a lower arch.
  • the indicator may be colored differently to differentiate regions of normal contact from regions of high contact (e.g., normal contact in green, high contact in red).
  • the method or system may be configured to calculate the regions of inter-arch collision on the first 3D model and to calculate the regions of inter-arch collision on the second 3D model.
  • the method or system may further set a threshold (or may apply a user-specified threshold) of contact degree to differentiate normal from high contact.
  • the user interface may include a dial or slider that allows selection of the degree of contact.
  • the method may include displaying a staging toolbar on a portion of a display, wherein the staging toolbar comprises a plurality of digital buttons that correspond to a plurality of treatment stages of the first orthodontic treatment; and changing the displayed stage of the second 3D model to correspond to a stage selected by the user from the staging toolbar.
  • the different 3D models may be displayed side-by-side, in either the same or different windows.
  • the method or system may include displaying an upper arch engaged with a lower arch of the first 3D model and displaying an upper arch engaged with a lower arch of the second 3D model.
  • Any of these methods or apparatuses may be configured to display a plurality of buttons corresponding to treatment features, wherein the buttons visually indicate the presence of the treatment feature in the first orthodontic treatment and further wherein the buttons visually indicate that that treatment feature is actively being displayed on either or both the first 3D model and the second 3D model.
  • the plurality of treatment features may comprise: tooth numbering, attachments, interproximal reduction spacing, hooks, ramps (e.g., bite ramps), pontics, etc.
  • Adjusting the view of both of the first 3D model and the second 3D model based on the user input may include modifying one or more of: a rotation, a zoom, or a pan of the first 3D model and the second 3D model.
  • these viewing options may be concurrently adjusted (translating the user adjustments in the display parameters of one 3D model to the other 3D model, etc., typically in real time).
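The concurrent adjustment described above, where a rotation, zoom, or pan applied to one displayed model is mirrored in real time onto the others, can be sketched as follows; the class and parameter names are assumptions for illustration:

```python
class SyncedViewer:
    """Keep the cameras of several side-by-side models in lock-step:
    an adjustment made on one model is replicated to all the others."""

    def __init__(self, n_models):
        self.views = [{"rotation": 0.0, "zoom": 1.0, "pan": (0.0, 0.0)}
                      for _ in range(n_models)]

    def adjust(self, source_index, **changes):
        """Apply the user's rotation/zoom/pan change to the source
        model's view and mirror it onto every other displayed view."""
        for view in self.views:
            view.update(changes)
        return self.views[source_index]
```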
  • the user input may include selecting from a set of preset views.
  • a method may include: displaying, on a display, a first three dimensional (3D) model of a subject's dentition that shows an arrangement of the subject's teeth before receiving a treatment, and a second 3D model that shows an arrangement of the subject's teeth subject to a first orthodontic treatment; wherein the second 3D model is displayed at either a final stage or an intermediate treatment stage of the first orthodontic treatment; adjusting a view of both of the first 3D model and the second 3D model based on a user input modifying a view of one of the first 3D model or the second 3D model, wherein adjusting the view comprises adjusting one or more of the rotation, zoom and pan; switching the view of both the first 3D model and the second 3D model to an occlusal view when the user selects a control, wherein the occlusal view shows occlusal surfaces of teeth in the first 3D model and the second 3D model, further wherein the occlusal view indicates one or more regions of inter-arch collisions on either or both of
  • the indicator may be colored or marked differently to differentiate regions of normal contact from regions of high contact.
  • the indicator may be a region associated with the button (e.g., within a boundary of the button), such as a box, circle, dot, etc., on the button and/or a marking on the button, including the text used to indicate the primary function of the button (e.g., attachments, IPR, pontics, extraction(s), etc.).
  • Any of these methods or apparatuses may include calculating regions of inter-arch collision on the first 3D model and calculating regions of inter-arch collision on the second 3D model, as part of an inter-arch collision calculator module. Any of these methods or apparatuses may also or alternatively include displaying a staging toolbar on a portion of a display, wherein the staging toolbar comprises a plurality of digital buttons that correspond to a plurality of treatment stages of the first orthodontic treatment; and changing the displayed stage of the second 3D model to correspond to a stage selected by the user from the staging toolbar.
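The core of an inter-arch collision calculator like the one named above is finding where the two occlusal surfaces interpenetrate and by how much. A production implementation would intersect triangle meshes; the sketch below only illustrates the penetration-depth bookkeeping on paired height samples, an assumed simplification:

```python
def collision_regions(upper_z, lower_z, tolerance=0.0):
    """Report inter-arch collisions on paired occlusal height samples.

    `upper_z` and `lower_z` are occlusal-surface heights sampled on the
    same XY grid, with z increasing toward the upper jaw. A positive
    difference lower - upper means the surfaces interpenetrate at that
    sample; the difference is the penetration depth."""
    regions = []
    for index, (upper, lower) in enumerate(zip(upper_z, lower_z)):
        depth = lower - upper
        if depth > tolerance:
            regions.append((index, depth))
    return regions
```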
  • any of the methods and apparatuses may include displaying the 3D models side-by-side, and/or displaying the upper arch engaged with the lower arch of the first 3D model and displaying the upper arch engaged with the lower arch of the second 3D model (e.g., showing the upper and lower arches intercuspating).
  • any of these methods and apparatuses may include displaying a plurality of buttons corresponding to treatment features, wherein the buttons visually indicate the presence of the treatment feature in the first orthodontic treatment and further wherein the buttons visually indicate that that treatment feature is actively being displayed on either or both the first 3D model and the second 3D model.
  • the plurality of treatment features may include: tooth numbering, attachments, interproximal reduction spacing, ramps, hooks, pontics, etc.
  • Adjusting the view of both of the first 3D model and the second 3D model based on the user input may include modifying one or more of: a rotation, a zoom, or a pan of the first 3D model and the second 3D model.
  • the user input may include selecting from a set of preset views.
  • Also described herein are systems for visualizing a subject's teeth that may include: one or more processors; a memory coupled to the one or more processors, the memory configured to store computer-program instructions, that, when executed by the one or more processors, perform a computer-implemented method comprising: displaying, on a display, a first three dimensional (3D) model of a subject's dentition that shows an arrangement of the subject's teeth before receiving a treatment, and a second 3D model that shows an arrangement of the subject's teeth subject to a first orthodontic treatment; wherein the second 3D model is displayed at either a final stage or an intermediate treatment stage of the first orthodontic treatment; adjusting a view of both of the first 3D model and the second 3D model based on a user input modifying a view of one of the first 3D model or the second 3D model; switching the view of both the first 3D model and the second 3D model to an occlusal view when the user selects a control, wherein the occlusal view shows occ
  • systems for visualizing a subject's teeth that include: one or more processors; a memory coupled to the one or more processors, the memory configured to store computer-program instructions, that, when executed by the one or more processors, perform a computer-implemented method comprising: displaying a staging toolbar on a first portion of a display, wherein the staging toolbar comprises one or more digital buttons that correspond to a plurality of three dimensional models, wherein the plurality of three dimensional models includes a first model that shows the subject's teeth position before receiving treatment, one or more intermediate models that show the subject's teeth position during treatment, and a final model that shows the subject's teeth position after receiving treatment; presenting a multiple view digital button on a second portion of the display; and presenting on the display both the first model and a second model when the multiple view digital button is selected by a user, wherein the second model is selected from one of the intermediate models or the final model.
  • the second model may be determined by selecting one of the digital buttons of the staging toolbar.
  • the computer-implemented method may further comprise: presenting a feature button on the display; and presenting, on the display, the feature on the three dimensional model of the subject's teeth.
  • the feature button may be an attachment button, wherein the computer-implemented method further comprises presenting, on the display, attachments on the subject's teeth when the attachment button is selected.
  • the feature button may be a pontics button, wherein the computer-implemented method further comprises presenting on the display, pontics on the subject's teeth when the pontics button is selected.
  • the feature button may be an interproximal reduction and space management button, wherein the computer-implemented method further comprises presenting, on the display, interproximal reduction and space management data on the subject's teeth when the interproximal reduction and space management button is selected.
  • Any of these methods may also include: presenting an on state for the feature button when the feature button is selected by the user; presenting an off state for the feature button when the feature button is not selected by the user, wherein the on state is visually distinguishable from the off state; and presenting the feature on the three dimensional model when the feature button is in the on state; not presenting the feature on the three dimensional model when the feature button is in the off state; and presenting an indicator associated with the feature button that indicates whether the feature is present or absent from the treatment.
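The on/off feature button with a separate presence indicator described above distinguishes two independent pieces of state: whether the feature exists in the treatment plan at all, and whether it is currently drawn. A minimal sketch, with assumed names and glyphs:

```python
from dataclasses import dataclass

@dataclass
class FeatureButton:
    """Toggle for a treatment feature (attachments, IPR, pontics, ...).
    `present` records whether the feature exists in the treatment plan
    at all; `on` records whether it is currently drawn on the model."""
    name: str
    present: bool
    on: bool = False

    def toggle(self):
        """Switch between the visually distinct on and off states."""
        self.on = not self.on
        return self.on

    def indicator(self):
        """Marker rendered with the button showing presence in the plan
        (the filled/empty glyphs are an assumed visual convention)."""
        return "●" if self.present else "○"

    def should_render(self):
        """Draw the feature only if it exists and the button is on."""
        return self.present and self.on
```

Separating `present` from `on` lets the button convey "this plan has no attachments" even while the attachments display toggle is off.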
  • the staging toolbar may include one or more hidden overcorrection stages, wherein the computer-implemented method further comprises presenting the one or more hidden overcorrection stages when the user clicks or selects a button next to the staging toolbar or integrated into one end of the staging toolbar.
  • the feature button may be an occlusal button, wherein the computer-implemented method further comprises presenting, on the display, occlusal contacts on the subject's teeth when the occlusal button is selected.
  • the occlusal contacts may comprise normal occlusal contacts that are shown in a first color and heavy inter-arch collisions that are shown in a second color.
  • Any of these methods may include: storing in a memory, a plurality of three dimensional models of the subject's teeth, wherein the plurality of three dimensional models includes a first model that shows the subject's teeth position before receiving treatment, one or more intermediate models that show the subject's teeth position during treatment, and a final model that shows the subject's teeth position after receiving treatment.
  • the method may further include displaying, using the processor, a staging toolbar on a first portion of the display, wherein the staging toolbar comprises one or more digital buttons that correspond to the plurality of three dimensional models; displaying, using the processor, a multiple view digital button on a second portion of the display; and displaying, using the processor, on the display both the first model and a second model when the multiple view digital button is selected by a user, wherein the second model is selected from one of the intermediate models or the final model.
  • the method may include: displaying, using the processor, a feature button on the display; and displaying, using the processor, on the display the feature on the three dimensional model of the subject's teeth.
  • the feature button may be an attachment button, wherein the method may further comprise displaying, using the processor, on the display attachments on the subject's teeth when the attachment button is selected.
  • the feature button may be a pontics button, wherein the method may further comprise displaying, using the processor, on the display pontics on the subject's teeth when the pontics button is selected.
  • the feature button may be an interproximal reduction and space management button, wherein the method may further comprise displaying, using the processor, interproximal reduction and space management data on the subject's teeth when the interproximal reduction and space management button is selected.
  • the method may include displaying, using a processor, an on state for the feature button when the feature button is selected by the user; displaying, using a processor, an off state for the feature button when the feature button is not selected by the user, wherein the on state is visually distinguishable from the off state; and displaying, using a processor, the feature on the three dimensional model when the feature button is in the on state; not displaying, using a processor, the feature on the three dimensional model when the feature button is in the off state; and displaying, using a processor, an indicator associated with the feature button that indicates whether the feature is present or absent from the treatment.
  • the staging toolbar may comprise one or more hidden overcorrection stages, wherein the method further comprises displaying the one or more hidden overcorrection stages when the user clicks or selects a button next to the staging toolbar or integrated into one end of the staging toolbar.
  • the feature button may be an occlusal button, wherein the method further comprises displaying on the display occlusal contacts on the subject's teeth when the occlusal button is selected.
  • a method of visualizing a subject's teeth on a display with a processor may include: storing in a memory, a plurality of three dimensional models of the subject's teeth, wherein the plurality of three dimensional models includes a first model that shows the subject's teeth position before receiving treatment, one or more intermediate models that show the subject's teeth position during treatment, and a final model that shows the subject's teeth position after receiving treatment; displaying, using the processor, a staging toolbar on a first portion of the display, wherein the staging toolbar comprises one or more digital buttons that correspond to the plurality of three dimensional models; displaying, using the processor, a multiple view digital button on a second portion of the display; and displaying, using the processor, on the display both the first model and a second model when the multiple view digital button is selected by a user, wherein the second model is selected from one of the intermediate models or the final model.
  • FIG. 1A is a diagram illustrating one example of a treatment plan review and/or modification system as described herein.
  • FIG. 1B is a diagram illustrating one example of an occlusal contact engine as described herein.
  • FIG. 1C schematically illustrates one example of a method of treatment plan review and/or modification as described herein.
  • FIG. 1D schematically illustrates one example of a method of treatment plan review and/or modification including occlusal contact severity.
  • FIGS. 2A-2C illustrate dual and single view display modes for viewing 3D models of a subject's teeth at various stages of a treatment plan.
  • FIG. 3 illustrates a dual view that can be combined with an occlusal view.
  • FIG. 4 illustrates the ability to hide and visualize features in dual view.
  • FIGS. 5A-5G illustrate viewing multiple treatment plans and various features that can be used during treatment.
  • FIGS. 6A-6C illustrate an occlusal view button with three indicator states.
  • FIGS. 7A and 7B illustrate how to switch between a view of multiple treatment plans and a view of a single treatment plan.
  • FIGS. 8A and 8B illustrate overcorrection stages that can be hidden and unhidden.
  • FIG. 9 illustrates a treatment form that can be used to prompt the doctor about overcorrection stages.
  • FIG. 10 illustrates an embodiment of a multiple treatment plan view with occlusal view switched on to display occlusal contacts.
  • FIGS. 11A and 11B illustrate that the multiple treatment plan view can be toggled between a closed mouth view and an open mouth occlusal view.
  • FIG. 12 illustrates that rotation of a 3D model in the multiple treatment plan view simultaneously rotates the other 3D models such that all the models are presented at the same viewing angle and perspective.
  • FIGS. 13A-13C illustrate various single treatment plan views with the occlusal view switched on.
  • FIGS. 14A and 14B illustrate various dual views with the occlusal view switched on.
  • Orthodontic devices such as aligners, palatal expanders, retainers, and dental implants can be used to adjust the position of teeth and to treat various dental irregularities.
  • a 3D digital model of the subject's teeth, dentition, and gingiva can be constructed from a 3D scan of the subject's mouth, teeth, dentition, and gingiva.
  • the 3D model of the subject's teeth and dentition can be displayed graphically to the doctor on a display using a computing system with memory and software.
  • Input devices such as a mouse and/or keyboard allow the doctor to manipulate the 3D model.
  • the systems and methods described herein are particularly well suited to be used in procedures involving aligners, but the systems and methods are also suitable for use with staging other types of orthodontic devices.
  • the 3D model can be rotated about any axis and can be zoomed in and out as desired.
  • Each individual tooth can be a separate object in the 3D model that can be manipulated by the doctor.
  • the doctor can manipulate the teeth using the input devices into a desired final teeth position.
  • the computer system can then determine the appropriate intermediate stages that can be used to move the teeth from the initial teeth position to the final teeth position.
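For illustration only, a minimal sketch of how intermediate stages might be computed from the initial and final teeth positions. The linear interpolation shown here is an assumption; the patent does not specify the staging algorithm, and real treatment planning would also account for collision avoidance and per-tooth movement limits. All names are hypothetical.

```python
def interpolate_stages(initial, final, num_stages):
    """Linearly interpolate each tooth's (x, y, z) position from its
    initial to its final coordinates, returning one position map per
    stage (the last stage equals the final position). Illustrative only."""
    stages = []
    for i in range(1, num_stages + 1):
        t = i / num_stages  # fraction of total movement at this stage
        stage = {
            tooth: tuple(a + (b - a) * t
                         for a, b in zip(initial[tooth], final[tooth]))
            for tooth in initial
        }
        stages.append(stage)
    return stages
```

For example, a tooth moving 2 mm along one axis over two stages would be placed at 1 mm in the intermediate stage and 2 mm in the final stage.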
  • the initial, final, and intermediate teeth position stages can be displayed to the doctor.
  • One or more graphical toolbars can be displayed to the doctor to facilitate making adjustments to the teeth and arch.
  • the toolbars can have buttons to perform various actions to the 3D model.
  • one toolbar can have buttons that allow the doctor to manipulate the viewing angle and perspective of the 3D model.
  • Another toolbar can have buttons for making various tooth adjustments, such as extrusion/intrusion, bucco-lingual translation, mesio-distal translation, rotation, crown angulation, bucco-lingual root torque, bucco-lingual crown tip, and mesio-distal crown tip.
  • the doctor can lock and keep a particular tooth at a desired position, and designate a tooth as unmovable for the duration of a treatment (e.g. crowns, implants).
  • Another toolbar can have buttons for making attachments and precision cuts. These buttons allow the doctor to add conventional attachments and precision cuts by simply dragging and dropping the attachment or cut to the tooth of choice, and the doctor can easily remove attachments and precision cuts by dragging them to the trash can.
  • the buttons also allow the doctor to adjust the placement and rotate conventional attachments, and change the size, prominence and degree of beveling of rectangular attachments.
  • the buttons allow the doctor to fine-tune the mesiodistal position of button cutouts.
  • Another toolbar can have buttons for posterior arch expansion and contraction. This toolbar allows the doctor to expand or contract posterior arches by expanding or contracting the upper arch only, the lower arch only, or both arches. As above, when an arch modification is made on the 3D model, some or all other teeth in the adjusted arch will automatically adjust in response.
  • Another toolbar can have buttons for interproximal reduction (IPR) and space management.
  • the doctor can choose to (1) select the auto adjust option: IPR and space automatically adjusts as you make adjustments on the 3D model; (2) select the keep current option: to preserve the current IPR configuration; (3) select the no IPR option: all existing IPR will be removed, and no IPR will be added; and (4) manually adjust IPR and space on the 3D model (add, remove or lock for specific teeth).
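The four IPR options above can be sketched as a small selection routine. This is an illustrative sketch only, with hypothetical names; manual per-tooth edits (option 4) would be layered on top of the selected result in a fuller implementation.

```python
from enum import Enum

class IPROption(Enum):
    AUTO_ADJUST = "auto"    # IPR recalculated as the 3D model changes
    KEEP_CURRENT = "keep"   # preserve the existing IPR configuration
    NO_IPR = "none"         # remove all existing IPR; add none

def apply_ipr_option(option, current_ipr, recomputed_ipr):
    """Return the IPR values (mm per interproximal contact) to use,
    given the doctor's selected option. Illustrative only."""
    if option is IPROption.NO_IPR:
        return {}
    if option is IPROption.KEEP_CURRENT:
        return dict(current_ipr)
    return dict(recomputed_ipr)  # AUTO_ADJUST
```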
  • Additional features that can be included in a toolbar include occlusal contacts, which identifies and displays to the doctor all or a subset of inter-arch occlusal contacts, and allows heavy occlusal contacts to be resolved directly on the 3D model.
  • Another feature can be dual view, where modifications made using 3D Controls may be compared side-by-side with the original set up.
  • Another feature is a Bolton analysis tool that provides reference information pertaining to tooth size discrepancy that is useful for planning how to address tooth interdigitation and arch coordination.
  • Another tool positions the 3D model on a grid that allows linear tooth movements to be measured and provides more precise control to the doctor to make measurements on the 3D model.
  • Another feature is a superimposition toolbar that superimposes tooth position at any stage in relation to tooth position at any other stage, and control which stage is blue (or another color) and which stage is white (or another different color) for a better visualization between stages.
  • the display can provide the doctor a dual view that shows and compares in one screen or display a first 3D model of the teeth position before the treatment (initial malocclusion) with a second 3D model of the teeth position at any stage of the treatment, such as an intermediate stage or the final stage.
  • FIG. 2A illustrates the initial malocclusion in a first 3D model 200 on the left side of the display, while the right side of the display shows a 3D model 202 of the dentition at the 10th stage of treatment.
  • a toolbar 204 at the bottom of the screen allows the doctor to select the stages to be displayed in dual view. The stages to be viewed can be selected by simply clicking the corresponding button on the toolbar 204 .
  • a default stage that is typically shown is the initial stage that shows the initial malocclusion.
  • the default stage can be changed to an intermediate stage.
  • the doctor can drag and drop a button representing one of the intermediate stages over the default 3D model shown on the left side of the screen in order to replace the initial stage with an intermediate stage. Using dual view the doctor can understand how the treatment goes from stage to stage in comparison to the initial malocclusion.
  • the first 3D model can be an intermediate stage and the second 3D model can be a subsequent intermediate stage or the final stage.
  • Using additional tools in the dual view gives the doctor additional details about the treatment in comparison to the initial malocclusion and teeth position, allows the doctor to view the effect of a particular action on tooth movement at any particular stage, and allows comparison of the initial malocclusion with any stage of the treatment.
  • Any of the tools described herein can be used in dual view to manipulate either of the 3D models shown in dual view.
  • using the occlusal view tool when in dual view gives the doctor the ability to view and compare maxillary and mandibular occlusal view for the initial malocclusion with the maxillary and mandibular occlusal view at any stage of the treatment.
  • FIG. 1A is a diagram showing an example of a treatment plan review and/or modification system 100 A.
  • the modules of the system 100 A may include one or more engines and datastores.
  • a computer system can be implemented as an engine, as part of an engine or through multiple engines.
  • an engine includes one or more processors or a portion thereof.
  • a portion of one or more processors can include some portion of hardware less than all of the hardware comprising any given one or more processors, such as a subset of registers, the portion of the processor dedicated to one or more threads of a multi-threaded processor, a time slice during which the processor is wholly or partially dedicated to carrying out part of the engine's functionality, or the like.
  • a first engine and a second engine can have one or more dedicated processors or a first engine and a second engine can share one or more processors with one another or other engines.
  • an engine can be centralized or its functionality distributed.
  • An engine can include hardware, firmware, or software embodied in a computer-readable medium for execution by the processor.
  • the processor transforms data into new data using implemented data structures and methods, such as is described with reference to the figures herein.
  • the engines described herein, or the engines through which the systems and devices described herein can be implemented, can be cloud-based engines.
  • a cloud-based engine is an engine that can run applications and/or functionalities using a cloud-based computing system. All or portions of the applications and/or functionalities can be distributed across multiple computing devices, and need not be restricted to only one computing device.
  • the cloud-based engines can execute functionalities and/or modules that end users access through a web browser or container application without having the functionalities and/or modules installed locally on the end-users' computing devices.
  • datastores are intended to include repositories having any applicable organization of data, including tables, comma-separated values (CSV) files, traditional databases (e.g., SQL), or other applicable known or convenient organizational formats.
  • Datastores can be implemented, for example, as software embodied in a physical computer-readable medium on a specific-purpose machine, in firmware, in hardware, in a combination thereof, or in an applicable known or convenient device or system.
  • Datastore-associated components such as database interfaces, can be considered “part of” a datastore, part of some other system component, or a combination thereof, though the physical location and other characteristics of datastore-associated components are not critical for an understanding of the techniques described herein.
  • Datastores can include data structures.
  • a data structure is associated with a particular way of storing and organizing data in a computer so that it can be used efficiently within a given context.
  • Data structures are generally based on the ability of a computer to fetch and store data at any place in its memory, specified by an address, a bit string that can be itself stored in memory and manipulated by the program.
  • Some data structures are based on computing the addresses of data items with arithmetic operations, while other data structures are based on storing addresses of data items within the structure itself.
  • Many data structures use both principles, sometimes combined in non-trivial ways.
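The two addressing principles above can be illustrated with a minimal sketch (not part of the claimed subject matter): an array reaches element i by arithmetic on a base address, while a linked list reaches it by following stored references.

```python
class Node:
    """A singly linked list node: stores a value plus a reference
    (the analogue of a stored address) to the next node."""
    def __init__(self, value, next_node=None):
        self.value = value
        self.next = next_node

def nth_from_list(head, n):
    """Reach the nth element by following n stored references,
    in contrast to an array's constant-time address arithmetic."""
    for _ in range(n):
        head = head.next
    return head.value
```

A Python list reaches `arr[2]` directly via index arithmetic; the linked list above must traverse two `next` references to reach the same element.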
  • the implementation of a data structure usually entails writing a set of procedures that create and manipulate instances of that structure.
  • the datastores, described herein, can be cloud-based datastores.
  • a cloud-based datastore is a datastore that is compatible with cloud-based computing systems and engines.
  • An example of a treatment plan review and/or modification system 100 A such as that shown in FIG. 1A may include a computer-readable medium 102 , view modification user input engine(s) 104 , staging toolbar engine(s) 106 , display output engine(s) 108 , an untreated 3D model datastore 110 , one or more treated 3D model datastore(s) 112 , an overcorrection stage engine 114 , an informative button engine 116 , and an occlusal contact engine 118 .
  • One or more of the modules of the system 100 A may be coupled to one another (e.g., through the example couplings shown in FIG. 1A ) or to modules not explicitly shown in FIG. 1A .
  • the computer-readable medium 102 may include any computer-readable medium, including without limitation a bus, a wired network, a wireless network, or some combination thereof.
  • the view modification user input engine(s) 104 may implement one or more automated agents configured to receive user input on the position and/or orientation of a displayed 3D model of a subject's teeth.
  • the system may coordinate (e.g., as part of a view coordination engine, not shown) the display of each of the 3D models concurrently being displayed so that changes in one model by the user are reflected in all or some of the other models, including changes in the position, etc.
  • the view modification engine(s) 104 may implement one or more automated agents configured to coordinate (in conjunction with the display output engine(s) 108 ) the 3D virtual representations of the patient's dental arches.
  • the view modification user input engine includes digital controls (e.g., digital buttons), such as the informative buttons (and may therefore interact with the informative button engine 116 ).
  • one or more buttons may include, for example, tooth numbering. Tooth numbering may be determined by the system or may be read as information about the tooth numbering in each 3D model (e.g., stored as part of the 3D model datastores).
  • a 3D model datastore (e.g., the untreated 3D model datastore or a treated 3D model datastore) may store tooth type identifiers associated with the teeth in each 3D model. The tooth type identifiers may correspond to numbers of a Universal Tooth Numbering System, character strings that identify tooth types by anatomy, images or portions thereof that identify tooth types by geometry and/or other characteristics, etc.
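As an illustrative sketch of one such identifier scheme (function name is hypothetical): in the Universal Tooth Numbering System, permanent teeth are numbered 1-32, with 1-16 running across the upper arch and 17-32 across the lower arch.

```python
def universal_tooth_arch(n):
    """Map a Universal Tooth Numbering System number (1-32, permanent
    dentition) to its arch: 1-16 are upper teeth, 17-32 are lower teeth."""
    if not 1 <= n <= 32:
        raise ValueError("Universal numbers run 1-32 for permanent teeth")
    return "upper" if n <= 16 else "lower"
```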
  • the staging toolbar engine 106 coordinates the user selection of one or more stages of the treatment plan represented by one or more of the 3D digital models of the subject's teeth.
  • the staging toolbar engine may map selected staging buttons (e.g., buttons labeled numerically and/or alphanumerically with one or more treatment stage indicators) to the display of a corresponding stage in each of the 3D models (or in comparison to the untreated 3D model).
  • the display output engine 108 is configured to coordinate the display of 3D model(s) of the subject's teeth with each other and with changes made by the user (e.g., physician, doctor, dentist, dental technician, etc.).
  • the overcorrection stage engine 114 may determine and/or coordinate display of one or more overcorrection stages, as will be described in greater detail below.
  • An informative button engine 116 may coordinate the use of one or more informative buttons that may modify the information specific to each (or all) of the treatment plan 3D digital models and may process this information into a user-selectable button that shows both the status of the button (e.g., on/off) and information that is based on all or a subset of the treatment plans, including whether all or some of the treatment plans include one or more features (e.g., treatment features, such as interproximal reduction (IPR), attachments, hooks, tooth ramps, etc.).
  • the system may include one or more datastores, including an untreated 3D (digital) model datastore 110 that may store the 3D model of the patient's untreated teeth, e.g., upper and/or lower arch.
  • the 3D model of the patient's untreated teeth may be imported (e.g., from an external file, remote server, etc.), scanned, e.g., using an intraoral scanner from a patient or a model of a patient's teeth and stored, or otherwise acquired by the system.
  • the system may include a treated 3D (digital) model datastore 112 for storing one or more 3D models of the patient's teeth during each stage of a treatment plan, including the final stage.
  • the datastore may also store information specific to the treatment plan, including features used to achieve tooth movement (including location on the teeth, etc.), number of stages, etc., or any other meta-information related to the referenced treatment plan.
  • the treated 3D (digital) models may be generated by the system or a separate system and imported/entered into the datastore. Any number of treatment 3D models (1, 2, 3, 4, 5, 6, 7, 8, 9, 10, etc.) may be stored and used, including selection by a user of which ones to show or display.
  • FIG. 1B shows a schematic example of another type of occlusal contact engine 118 .
  • the occlusal contact engine may be invoked by a user command (e.g., selection of a user input such as a digital button, switch, etc.).
  • the occlusal contact engine may include an occlusal view display engine 120 that may receive an instruction to switch between a current view of one or more digital model(s) (e.g., a plurality of concurrently displayed digital models) into a view showing occlusal surfaces of one or both upper and/or lower arch surfaces.
  • the system may translate the current 3D model display(s) into occlusal views, with the upper and lower arch, when both are shown concurrently, arranged with the upper arch above the lower arch, and both spread essentially flat.
  • the occlusal contact engine may also include a collision contact engine 122 for calculating (or receiving from an outside source that has already pre-calculated) the occlusal contact between the teeth of the upper and lower jaws.
  • the collision contact engine may estimate, from the 3D models, the normal intercuspation for each of the upper and lower jaws, and may determine where the intercuspation results in collision or contact between the teeth of the upper jaw and the teeth of the lower jaw. Both location and severity of collision may be estimated by the collision contact engine.
  • a collision scoring engine 124 may score the extent of the collision between the teeth of the upper and lower arch during intercuspation. The score may be qualitative and/or quantitative.
  • the scoring engine may apply a threshold (e.g., from the collision threshold engine 128 ) to determine if a collision is mild, extreme, etc.
  • the scoring engine may apply a threshold based on, e.g., a patient set threshold.
  • the collision threshold engine 128 may present a control on the display that the user may adjust to set or change the threshold (this may be reflected dynamically in real time in the display of the occlusal collisions).
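The scoring and threshold behavior described above can be sketched as follows. The 0.2 mm default threshold and all names here are placeholders, not values from the patent; the patent only describes the threshold as adjustable (e.g., per patient, via a display control).

```python
def score_collision(penetration_mm, heavy_threshold_mm=0.2):
    """Classify an inter-arch contact by its penetration depth during
    intercuspation: no contact, a normal occlusal contact (shown in a
    first color), or a heavy collision (shown in a second color).
    Threshold value is an arbitrary placeholder."""
    if penetration_mm <= 0:
        return "none"
    if penetration_mm < heavy_threshold_mm:
        return "normal"
    return "heavy"
```

Raising the threshold reclassifies borderline contacts from heavy to normal, which is how an adjustable control could be reflected dynamically in the display of occlusal collisions.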
  • the occlusal contact engine may also include a collision display engine 126 that coordinates the display of the determined collisions onto the 3D model(s) of the patient's teeth.
  • the occlusion(s) may be graphically illustrated on all of the treatment stages or in just the last (e.g., final) treatment stage(s).
  • Any of these systems may also include a modification engine (not shown) configured to receive user modification of one or more treatment plans, which may be used to submit for the generation of new treatment plan(s). Any of these systems may also include a final approval and fabrication engine(s), not shown.
  • An aligner fabrication engine(s) may implement one or more automated agents configured to fabricate an aligner. Examples of an aligner are described in detail in U.S. Pat. No. 5,975,893, and in published PCT application WO 98/58596, which is herein incorporated by reference for all purposes. Systems of dental appliances employing technology described in U.S. Pat. No. 5,975,893 are commercially available from Align Technology, Inc., Santa Clara, Calif., under the tradename, Invisalign System.
  • the term “aligner” as used herein encompasses the terms “orthodontic aligner”, “aligner”, and “dental aligner” as used in dental applications.
  • the aligner fabrication engine(s) 108 may be part of 3D printing systems, thermoforming systems, or some combination thereof.
  • FIGS. 2B and 2C illustrate a dual view button 206 , shown in the upper left corner of the display in this embodiment, for toggling the display between dual view and single view.
  • the dual view button 206 is not selected and the stage 10 button is selected in the stage selection toolbar 204 , which results in stage 10 of the treatment plan being shown in the display in single view mode.
  • the dual view button 206 has been selected along with stage 10 in the stage selection toolbar 204 to show the initial malocclusion on the left side and stage 10 on the right side of the display.
  • FIG. 3 illustrates dual view combined with an occlusal view that can be selected by toggling an occlusal view button 300 .
  • the dual view button 206 and the occlusal view button 300 have been selected along with the button for stage 10 in the stage selection toolbar 204 . This results in the occlusal contacts of the dentition in the initial malocclusion being shown and compared with the occlusal contacts of stage 10 .
  • FIG. 4 illustrates that in the dual view mode, the doctor is able to visualize or hide attachments 402 and other aligner features, IPR/space information 404 for the teeth contacts, and pontics, using the Attach, IPR and Pontic buttons 400 on the toolbar, in order to analyze how these features are used for the particular treatment, what they help to treat, and whether adding, removing, or adjusting any of them would help the treatment plan.
  • Dual view in this case helps the doctor visualize and analyze this information in comparison to the initial malocclusion view.
  • the doctor is able to switch the right 3D model to any stage of the treatment using staging toolbar 204 at the bottom of the window in order to visualize and compare treatment details of the desired stage to the initial malocclusion (i.e., initial teeth position).
  • stage 10 is selected by selecting the # 10 button on the staging toolbar 204 .
  • multiple treatment plans can be shown in one screen.
  • the doctor can also select one of the plans to obtain further details of the plan and switch to a single plan view.
  • the various features used or not used in the various treatment plans such as attachments, IPR/spaces, pontics and/or information about occlusal contacts, can be shown simultaneously in the multiple view mode as shown in FIGS. 5A-5G .
  • Tools can indicate to the doctor whether a particular feature is present or absent in the plan/plans shown in the screen. Even if the doctor switches some features' visualization OFF, the indicator's color will still clearly show whether the feature is used in the treatment or absent. For example, if a feature is used in the treatment, the indicator for that feature (e.g., a circle, square, triangle, or other shaped object on the button) can be colored and filled in, and when the feature is not used, the indicator can be empty and uncolored.
  • the state of the tool buttons for the various features allows the doctor to quickly determine whether that feature is present or absent in a particular treatment plan. For example, suppose that in a previous search some feature was absent (its indicator empty or uncolored, with the tool switched OFF, for example), but in a new search the feature is present (the tool button is still switched OFF, but the indicator becomes colored to show that the feature is now present). The doctor will see the changed indicator, understand whether he needs to review details regarding the changed feature, and may switch the related feature tool ON to visualize this information on the 3D model.
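The two independent pieces of button state described here can be sketched as a small class (all names hypothetical, not from the patent): the doctor's ON/OFF visualization toggle and a presence indicator that tracks the currently displayed plans. Filtering to new plans updates the indicator but never the toggle.

```python
class FeatureButton:
    """A toolbar feature button (e.g., Attach, IPR, Pontic) with an
    ON/OFF visualization toggle and a filled/empty presence indicator.
    Illustrative sketch only."""
    def __init__(self, feature):
        self.feature = feature
        self.on = False        # doctor's visualization toggle
        self.present = False   # indicator: filled if any shown plan uses it

    def toggle(self):
        self.on = not self.on

    def update_for_plans(self, plans):
        # plans: list of sets of feature names used by each displayed plan;
        # only the indicator changes -- the ON/OFF state is preserved
        self.present = any(self.feature in p for p in plans)

    def shows_feature(self):
        # the feature is drawn on the 3D model only if it is present
        # in a shown plan AND its visualization is toggled ON
        return self.on and self.present
```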
  • the attachment filter 508 is set to “yes” which means that attachments are used in the treatment plans, and the attach tool/button 500 and the IPR tool/button 502 are switched ON, which means that information for those features (i.e., attachments 501 and values for interproximal reductions 503 ) is shown on the 3D models for the treatment plans.
  • the occlus tool/button 504 and pontic tool/button 506 are switched OFF, meaning that information for those features are hidden and not displayed on the 3D models.
  • the attach tool/button 500 and the IPR tool/button 502 have filled/colored indicators (shown on the button as a filled circle) which means that the treatment plans shown in these views have these features used in the treatment.
  • the occlus tool/button 504 and the pontic tool/button 506 have an empty indicator (shown on the button as an empty circle) which means that these features are absent in the shown treatment plans.
  • the attachment filter 508 has been set to “no” in both treatment plans, which means that attachments are not used in the treatment plans. Consequently, although the attach tool/button 500 is still switched ON, the attach tool/button 500 indicator is empty, which means that attachments are not used in either plan, which is confirmed by the visualization of the 3D models, which no longer show attachment objects.
  • FIGS. 5C and 5D illustrate how a particular feature can be hidden or removed from the 3D model even though the feature is present in the treatment plan. This can be useful for simplifying the view of the 3D model when focusing on another feature.
  • the attach button 500 is selected “ON” (indicated by the colored icon on the button) and the attach button indicator is filled which means that attachments 501 are present in the treatment plan and viewable on the 3D model since the attach button 500 is “ON”.
  • the attach button 500 can be switched “OFF” by clicking the button to toggle the button from one state to another.
  • the attach button indicator is still filled and colored, which means that the attachment features are present in the treatment plans even if their visualization in the 3D models is switched OFF.
  • other features can be hidden as well by toggling the feature button to an OFF state.
  • FIGS. 5E-5G illustrate how filters can be used to select treatment plans using or not using various features, such as attachments and IPR.
  • the use of a feature in a treatment plan can be quickly determined by looking at the feature indicators (filled indicator means feature used and empty indicator means feature not used).
  • the attachment filter 508 is set to “No”, which causes the attach button 500 indicator to be empty, meaning that attachments are not being used in the treatment plans.
  • the attachment filter 508 is changed to “Yes”, which causes the attach button 500 indicator to be filled, meaning attachments are used in the treatment plan but are not shown in the model because the attach button 500 is deselected.
  • the attachment filter 508 is changed back to “No”, which causes the attach button 500 indicator to revert back to being empty as in FIG. 5E , meaning attachments are not used in the treatment plans.
  • tool button states are not changed when searching/filtering for specific treatment plans. Therefore, if a tool button is in an “OFF” state, it will keep that “OFF” state for the new searched/filtered plans. Similarly, if the tool button is in an “ON” state, it will keep the “ON” state for the new searched/filtered plans. However, the indicators for those buttons will change between filled and empty to indicate whether the feature is present or absent from the new searched/filtered treatment plans.
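The separation described above between a button's user-controlled ON/OFF state and its plan-derived indicator can be sketched as follows. This is a hypothetical illustration, not the patented implementation; the `FeatureButton`, `apply_filter`, and `refresh_indicators` names and the plan dictionaries are assumed for the example.

```python
from dataclasses import dataclass

@dataclass
class FeatureButton:
    """Toolbar button: `visible` is the user-controlled ON/OFF toggle;
    `indicator_filled` reflects whether the feature appears in the
    currently shown treatment plans."""
    name: str
    visible: bool = True
    indicator_filled: bool = False

def apply_filter(plans, feature, value):
    """Keep only plans whose use of `feature` matches the filter value."""
    return [p for p in plans if (feature in p["features"]) == value]

def refresh_indicators(buttons, plans):
    """Indicators follow the filtered plans; button ON/OFF states do not."""
    for b in buttons:
        b.indicator_filled = any(b.name in p["features"] for p in plans)

plans = [
    {"id": "A", "features": {"attach", "ipr"}},
    {"id": "B", "features": {"ipr"}},
]
buttons = [FeatureButton("attach"), FeatureButton("ipr")]

shown = apply_filter(plans, "attach", False)   # attachment filter = "No"
refresh_indicators(buttons, shown)
# the attach button stays ON, but its indicator is now empty
```

In this sketch, setting the attachment filter to “No” leaves only plan B visible, so the attach indicator empties while the attach button itself remains ON, matching the behavior described for FIGS. 5E-5G.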
  • one or more feature buttons can have an indicator with more than two states, such as three states.
  • FIGS. 6A-6C illustrate the three states of the occlus button 600 .
  • the occlus button 600 has an empty indicator, which means that there are no normal occlusal contacts or heavy inter-arch collisions.
  • the occlus button 600 has a green indicator which means that the treatment plans shown in the screen have only normal occlusal contacts.
  • the occlus button 600 has a red indicator which means that the treatment plans have heavy inter-arch collisions.
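The three-state indicator of FIGS. 6A-6C amounts to a simple priority rule over the plan's inter-arch contacts. The sketch below is illustrative only; the function name and the per-contact `"severity"` field are assumptions, not part of the patented system.

```python
def occlus_indicator(contacts):
    """Map a treatment plan's inter-arch contacts to the three occlus
    button indicator states described for FIGS. 6A-6C:

    - no contacts at all   -> "empty"
    - any heavy collision  -> "red"
    - only normal contacts -> "green"
    """
    if not contacts:
        return "empty"
    if any(c["severity"] == "heavy" for c in contacts):
        return "red"
    return "green"
```

Note that a single heavy collision dominates any number of normal contacts, which is why the red state is checked before the green one.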
  • the selected treatment plan will be opened in the single treatment plan (STP) view.
  • Toolbar button states can be switched ON or OFF.
  • indicators can be changed to show actual information about feature availability for the particular plan.
  • the toolbar showed a green indicator for attach button 702 because one of the plans has attachments 703 .
  • the attach button 702 won't change its state, but its indicator will become empty because the selected plan does not use this feature in the treatment.
  • the doctor can visualize or hide overcorrection stages in the treatment using a tool/button placed next to or integrated into the end of the staging toolbar 800 that represents the treatment stages.
  • the overcorrection stages are hidden and a “+” button 802 can be clicked or selected to unhide and show the overcorrection stages in the staging toolbar 800 .
  • the overcorrection stages 804 are visible in the staging toolbar 800 as blue lines (although other colors or patterns can be used to distinguish the overcorrection stages from the other stages).
  • the doctor can review the details of the overcorrection stages 804 or the normal stages by selecting the stage using the staging toolbar 800 .
  • An “X” button 806 can be clicked or selected to hide the overcorrection stages.
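The show/hide behavior of the overcorrection stages in the staging toolbar can be sketched as a function that decides which entries to render. This is a hypothetical sketch; the `toolbar_stages` name and the stage dictionaries are assumptions made for illustration.

```python
def toolbar_stages(stages, show_overcorrection):
    """Return the entries to render in the staging toolbar 800.

    Overcorrection stages sit at the end of the plan. When hidden,
    the toolbar ends with a "+" control (button 802) to reveal them;
    when shown, it ends with an "X" control (button 806) to hide
    them again.
    """
    normal = [s for s in stages if not s.get("overcorrection")]
    if show_overcorrection:
        return stages + [{"control": "X"}]
    return normal + [{"control": "+"}]
```

A renderer would then draw the overcorrection entries in a distinguishing color (e.g., the blue lines 804) and the trailing entry as the clickable “+”/“X” control.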
  • an overcorrection technique can be used, for example, for a virtual c-chain with aligners, which simulates the effect of using elastic c-chains in bracket-and-wire treatments.
  • a treatment form can be displayed on the screen as shown in FIG. 9 to the doctor as a prompt or reminder for asking about and/or using overcorrection stages.
  • overcorrection stages are used to provide additional (extra) forces for specific tooth movements.
  • teeth continue moving in the same direction as originally planned. Overcorrection stages are requested for particular tooth movements in order to achieve ideal treatment results.
  • the system can display on a screen an occlusal view of the subject's dentition in an open mouth configuration where the upper arch 1000 is shown on the top of the screen and the lower arch 1002 is shown at the bottom of the screen, and this view can be combined and overlaid with the inter-arch contacts 1004 visualization which are shown in the figure as colored (e.g., green for normal occlusal contacts and red for heavy inter-arch collisions) areas on the teeth.
  • the doctor can clearly see teeth alignment for both arches in the initial teeth position (shown in FIG. 10 on the left), in the final teeth position, and at any stage of the treatment (shown in FIG. 10 in the middle and on the right).
  • the doctor also can visualize and investigate inter-arch contacts on both arches simultaneously.
  • Occlusal contact 1004 visualization on the initial teeth position is very important and may be critical for doctors to verify that the bite has been set correctly. With this feature, doctors have a special tool to check that the initial bite setup is correct based on the pattern of occlusal contacts.
  • Occlusal contact visualization on the final teeth position gives the doctor an understanding of whether the treatment will be efficacious and whether the subject will have heavy inter-arch contact collisions after the treatment. Based on this information, the system or doctor can decide whether the treatment requires modifications to fix such problems or whether the treatment is satisfactory and can be continued.
  • the doctor is also able to visualize and compare, in the occlusal views, the inter-arch contacts for the initial malocclusion and for the final (or intermediate) teeth positions for two different treatment plans that are selected for comparison.
  • the combination of the occlusal view with the inter-arch contacts visualization gives the doctor the ability to check the teeth alignment of both arches and analyze inter-arch contact information simultaneously.
  • Such views allow the doctor to examine inter-arch contacts in the initial teeth position to verify whether the initial bite setup was done correctly.
  • Such views allow the doctor to check inter-arch contacts in the final teeth position in order to make sure that the treatment is proper and efficacious and does not have any major issues that would prohibit going forward with the treatment plan. Otherwise, if for example heavy inter-arch contacts are present, the doctor is able to modify the treatment plan to fix and eliminate such issues from the approved treatment plan.
  • In the multiple treatment plans (MTP) view, the doctor is able to view and compare the occlusal view and inter-arch contacts in the initial teeth position and in the final teeth position for two different treatment plans that are selected for comparison.
  • In the single treatment plan (STP) view, the doctor is also able to view inter-arch contacts at any stage of the treatment. This additional information gives the doctor a fuller understanding of what will happen with the inter-arch contacts during the course of the treatment and whether the treatment plan should be modified or corrected in the middle of the treatment or before beginning treatment.
  • the system can warn the doctor about the heavy inter-arch collisions.
  • a tool which is used to switch occlusal view with inter-arch contacts visualization can have a special red indicator, notifying the doctor about heavy collisions; or it can have a green indicator if the case has only normal occlusal contacts.
  • FIG. 11A shows a closed mouth view with the occlus button 1100 not selected.
  • a filled green indicator on the occlus button indicates that there are normal occlusal contacts through the treatment and that no heavy inter-arch collisions are present during the course of the treatment.
  • Clicking or otherwise selecting the occlus button 1100 changes the screen to the occlus view as shown in FIG. 11B with all 3D models changed to the occlusal view with the occlusal contacts visualization enabled.
  • the doctor is able to compare teeth alignment for both arches for different treatment plans selected in MTP view and compare it with the initial teeth position shown on the initial malocclusion 3D model.
  • the 3D models can be simultaneously rotated by selecting the rotate button 1200 .
  • Rotating the 3D models allows the doctor to view and analyze any side of the 3D model and check inter-arch contacts in detail if needed.
  • Rotating one 3D model will rotate the others in the MTP view simultaneously, with all 3D models presented at the same angle and perspective. This allows the doctor to check differences in all the selected treatment plans and the initial malocclusion right away in one screen by comparing visualized information from any side on all 3D models shown in that view.
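The simultaneous rotation behavior described above can be sketched as a single shared camera whose state is broadcast to every viewport in the MTP view. This is an illustrative sketch under assumed names (`SharedCamera`, `listeners`), not the patented implementation.

```python
import math

class SharedCamera:
    """One camera state applied to every 3D model in the MTP view, so
    rotating any one model rotates all of them to the same angle and
    perspective."""

    def __init__(self):
        self.yaw = 0.0
        self.pitch = 0.0
        self.listeners = []   # one redraw callback per displayed 3D model

    def rotate(self, d_yaw, d_pitch):
        # wrap yaw and clamp pitch, then redraw every viewport in sync
        self.yaw = (self.yaw + d_yaw) % (2 * math.pi)
        self.pitch = max(-math.pi / 2, min(math.pi / 2, self.pitch + d_pitch))
        for redraw in self.listeners:
            redraw(self.yaw, self.pitch)
```

A drag gesture in any one viewport calls `rotate` once, and every registered viewport (each treatment plan plus the initial malocclusion model) redraws at the identical angle.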
  • FIGS. 13A-13C when the doctor is in a single treatment plan (STP) view (when only one treatment plan is available for review with additional details like staging), the treatment stages are viewable via the staging toolbar 1300 and the occlus button 1302 can be used to check/review/analyze inter-arch collisions/contacts 1304 on any stage of the treatment.
  • FIG. 13A illustrates a STP view of the initial teeth position with the occlusal view switched on.
  • FIG. 13B illustrates a STP view of the final teeth position with the occlusal view switched on.
  • FIG. 13C illustrates a STP view of a middle stage of the treatment with the occlusal view switched on.
  • in some variations, the occlusal contacts are not shown in any of the middle stages.
  • in other variations, the occlusal contacts 1304 are also shown in the middle stages.
  • by clicking or selecting the dual view button 1400 in the STP view, the doctor is also able to switch to a dual view where two 3D models are visualized simultaneously: the left one showing the 3D model of the initial malocclusion; the right one showing the selected treatment plan and stage.
  • the doctor can compare teeth alignment before and after the treatment and see how inter-arch contacts are changed.
  • the heavy inter-arch collisions 1406 are shown in red and the normal occlusal contacts 1408 are shown in green.
  • the doctor is also able to switch the right 3D model to any stage of the treatment by selecting the desired stage using the stage toolbar 1401 .
  • the doctor can also compare details for the teeth alignment and inter-arch contacts for initial teeth position and any stage of the treatment.
  • the stage toolbar 1401 has been used to select stage 10 1403 with the occlusal view switched on.
  • the inter-arch contacts are not shown in the middle stages, while in other embodiments, the inter-arch contacts are shown for the middle stages.
  • FIGS. 1C and 1D illustrate examples of methods, e.g., methods of treatment plan review and/or modification as described herein and illustrated above.
  • the method 130 may include initially receiving or generating a 3D digital model of the patient's teeth in an initial (untreated) configuration, and one or more 3D digital models of the patient's teeth following a treatment plan (or plans).
  • the digital models may be based on a digital scan of the patient's dentition (e.g., from an intraoral scan and/or a scan of a dental impression).
  • the method may also include displaying a staging toolbar on a first portion of a display, wherein the staging toolbar comprises a plurality of digital buttons that correspond to a plurality of three dimensional (3D) digital models of a subject's dentition, wherein the plurality of 3D models includes a first model that shows an arrangement of a subject's teeth before receiving a treatment, one or more intermediate models that show an arrangement of the subject's teeth during a stage of the treatment, and a final model that shows an arrangement of the subject's teeth after receiving the treatment 132 .
  • the staging toolbar comprises a plurality of digital buttons that correspond to a plurality of three dimensional (3D) digital models of a subject's dentition
  • the plurality of 3D models includes a first model that shows an arrangement of a subject's teeth before receiving a treatment, one or more intermediate models that show an arrangement of the subject's teeth during a stage of the treatment, and a final model that shows an arrangement of the subject's teeth after receiving the treatment 132
  • the method may include displaying a displayed 3D model corresponding to one of the plurality of 3D models 134 , and changing the displayed 3D model to correspond to whichever digital button is selected by a user 136 (e.g., adjusting a view of the displayed 3D model shown on the display based on a user input, wherein the view of the displayed 3D model is applied to the changed displayed 3D model as the user selects the digital buttons).
  • the method may include displaying a plurality of buttons corresponding to treatment features, wherein the buttons visually indicate the presence of the treatment feature in the first orthodontic treatment and further wherein the buttons visually indicate that that treatment feature is actively being displayed on either or both the first 3D model and the second 3D model 138 .
  • the method may also include receiving any modifications to the treatment plan from the user 139 . These modifications may then be used to generate a new or modified treatment plan.
  • FIG. 1D illustrates another example of a method of treatment plan review and/or modification.
  • the method includes indicating occlusal contact severity.
  • the method includes displaying, on a display, a first three dimensional (3D) model of a subject's dentition that shows an arrangement of the subject's teeth before receiving a treatment, and a second 3D model that shows an arrangement of the subject's teeth subject to a first orthodontic treatment 142 .
  • the method may also include retrieving, receiving or otherwise generating the 3D digital model(s).
  • the method may further include displaying the second 3D model at either a final stage or an intermediate treatment stage of the first orthodontic treatment 144 , and adjusting a view of both of the first 3D model and the second 3D model based on a user input modifying a view of one of the first 3D model or the second 3D model 146 .
  • the method may include receiving the collisions between the upper and lower arches in each of the untreated and one or more treated 3D models; optionally the method may include calculating collisions between upper and lower arches of the treated and/or untreated 3D models 148 .
  • the one or more regions of inter-arch collisions may be indicated on either or both of the first 3D model and the second 3D model 152 . This may be indicated in color, by label, etc., as described above.
  • the method or system may then switch the view of both the first 3D model and the second 3D model to an occlusal view (e.g., when the user selects a control, wherein the occlusal view shows occlusal surfaces of teeth in the first 3D model and the second 3D model) 150 .
  • the system may then optionally receive, from the user, modifications to one or more treatment plans, such as further treatment instructions 152 .
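The collision calculation of step 148 could, in principle, be performed on sampled arch surfaces. The point-based sketch below is a minimal assumed illustration, not the patent's method: it classifies each upper-arch surface point by its gap to the nearest lower-arch point, and the threshold values (in mm) are invented for the example.

```python
import math

def classify_contacts(upper_pts, lower_pts, normal_tol=0.5, heavy_tol=0.1):
    """For each sampled point on the upper arch, measure the gap to the
    nearest lower-arch point. Gaps under `heavy_tol` mm count as heavy
    inter-arch collisions (shown red); gaps under `normal_tol` mm count
    as normal occlusal contacts (shown green). Thresholds are
    illustrative only."""

    def gap(p):
        return min(math.dist(p, q) for q in lower_pts)

    contacts = []
    for p in upper_pts:
        g = gap(p)
        if g < heavy_tol:
            contacts.append((p, "heavy"))
        elif g < normal_tol:
            contacts.append((p, "normal"))
    return contacts
```

Step 152 would then color each returned point region on the displayed 3D models (e.g., red for “heavy”, green for “normal”), as described above.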
  • references to a structure or feature that is disposed “adjacent” another feature may have portions that overlap or underlie the adjacent feature.
  • spatially relative terms such as “under”, “below”, “lower”, “over”, “upper” and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if a device in the figures is inverted, elements described as “under” or “beneath” other elements or features would then be oriented “over” the other elements or features. Thus, the exemplary term “under” can encompass both an orientation of over and under.
  • the device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
  • the terms “upwardly”, “downwardly”, “vertical”, “horizontal” and the like are used herein for the purpose of explanation only unless specifically indicated otherwise.
  • the terms “first” and “second” may be used herein to describe various features/elements (including steps); these features/elements should not be limited by these terms, unless the context indicates otherwise. These terms may be used to distinguish one feature/element from another feature/element. Thus, a first feature/element discussed below could be termed a second feature/element, and similarly, a second feature/element discussed below could be termed a first feature/element without departing from the teachings of the present invention.
  • a numeric value may have a value that is +/−0.1% of the stated value (or range of values), +/−1% of the stated value (or range of values), +/−2% of the stated value (or range of values), +/−5% of the stated value (or range of values), +/−10% of the stated value (or range of values), etc.
  • Any numerical values given herein should also be understood to include about or approximately that value, unless the context indicates otherwise. For example, if the value “10” is disclosed, then “about 10” is also disclosed. Any numerical range recited herein is intended to include all sub-ranges subsumed therein.


Abstract

Orthodontic devices such as aligners, palatal expanders, retainers, and dental implants can be used to adjust the position of teeth and to treat various dental irregularities. To help the clinician or doctor (i.e., orthodontist) design and plan the subject's treatment plan, a 3D digital model of the subject's teeth, dentition, and gingiva can be constructed from a 3D scan of the subject's mouth, teeth, dentition, and gingiva. The 3D model of the subject's teeth and dentition can be displayed graphically to the doctor on a display using a computing system with memory and software.

Description

CROSS REFERENCE TO RELATED APPLICATIONS
This patent application claims priority to U.S. provisional patent application No. 62/692,538, titled “VISUALIZATION OF TEETH” and filed on Jun. 29, 2018, herein incorporated by reference in its entirety.
This application may be related to U.S. patent application Ser. No. 16/178,491, titled “AUTOMATIC TREATMENT PLANNING,” filed on Nov. 1, 2018 and herein incorporated by reference in its entirety.
INCORPORATION BY REFERENCE
All publications and patent applications mentioned in this specification are herein incorporated by reference to the same extent as if each individual publication or patent application was specifically and individually indicated to be incorporated by reference.
FIELD
Embodiments of the invention relate generally to systems and methods for the visualization of teeth.
BACKGROUND
Orthodontic devices such as aligners, palatal expanders, retainers, and dental implants can be used to adjust the position of teeth and to treat various dental irregularities. To help the clinician or doctor (i.e., orthodontist) design and plan the subject's treatment plan, a three-dimensional (3D) digital model of the subject's teeth, dentition, and gingiva can be constructed from a 3D scan of the subject's mouth, teeth, dentition, and gingiva. The 3D model of the subject's teeth and dentition can be displayed graphically to the doctor on a display using a computing system with memory and software.
It would be desirable to provide the doctor with the ability to easily visualize and compare various 3D models of the subject's teeth at different stages of the treatment along with the effect of various features that can be used during the treatment.
SUMMARY OF THE DISCLOSURE
The methods and apparatuses (e.g., systems, devices, etc.) described herein may relate to orthodontic treatment planning, including the visualization of teeth for modifying, enhancing and improving treatment plans. In particular, described herein are methods and apparatuses for reviewing, analyzing and/or modifying orthodontic treatment plans. These methods may include one or more user interfaces that are configured to improve review and modification of orthodontic treatment planning.
For example a method may include: displaying a staging toolbar on a first portion of a display, wherein the staging toolbar comprises a plurality of digital buttons that correspond to a plurality of three dimensional (3D) digital models of a subject's dentition, wherein the plurality of 3D models includes a first model that shows an arrangement of a subject's teeth before receiving a treatment, one or more intermediate models that show an arrangement of the subject's teeth during a stage of the treatment, and a final model that shows an arrangement of the subject's teeth after receiving the treatment; displaying a displayed 3D model corresponding to one of the plurality of 3D models; changing the displayed 3D model to correspond to whichever digital button is selected by a user; adjusting a view of the displayed 3D model shown on the display based on a user input, wherein the view of the displayed 3D model is applied to the changed displayed 3D model as the user selects the digital buttons.
The staging toolbar may be a virtual toolbar including the plurality of digital buttons, which may be arranged within the display (screen, touchscreen, virtual display, augmented reality display, etc.). For example, the staging toolbar may be an arrangement of virtual buttons on the top, side(s) and/or bottom of the display that may be selected by a user, e.g., by clicking on them or otherwise selecting them.
The digital model of the subject's dentition may include a 3D surface (or in some variations surface and volumetric) model of the subject's upper and/or lower arch, including teeth and in some variations gingiva (e.g., particularly the portion of gingiva around the teeth). The 3D model may be segmented into individual teeth that may be separately selected and/or moved by the user or system. The system or method may store user inputs and/or generate user output, e.g., modifications to the display, based on user selections from the controls and the processing by the system.
At least some of the digital buttons may correspond to overcorrection stages. The digital buttons that correspond to the overcorrection stages may be hidden or revealed by user-controlled switch (e.g., a virtual button on the display that allows the user to toggle between showing and hiding the overcorrection stages), and/or selecting the overcorrecting stages as part of an actual treatment plan.
The user input may include adjustments to the display of the 3D model, including one or more of: a rotation, a zoom, or a pan of the displayed 3D model. Additional tools may include showing the surface of the 3D model, showing a wireframe of the 3D model, changing the color of the 3D model, etc.
The user input may include selecting from a set of preset views, such as showing the 3D model of the upper and/or lower jaw in a frontal view, a left side view, a right side view, a back view, etc.
The user input may include selecting to display or hide on the view of the teeth of the 3D model one or more of: tooth numbering, attachments, interproximal reduction spacing, and pontics. These display options may be separately controlled, e.g., by including one or more virtual controls (e.g., buttons, switches, etc.) that allows the selection of each of these features individually or collectively. The system may include processing to determine or suggest one or more of these features (e.g., determining automatically or semi-automatically tooth numbering, position, number and/or orientation of attachments, hooks, ramps, etc.).
Changing the displayed 3D model to correspond to whichever digital button is selected by a user may include calculating the viewing angle and magnification from a current displayed 3D model and applying the calculated viewing angle and magnification to a new 3D model from the plurality of 3D models corresponding to the digital button selected by the user.
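One way to realize this view carry-over is to treat the viewing angle and magnification as viewport state that survives a model swap. The sketch below is an assumed illustration of that idea (the `Viewport` and `select_stage` names are invented), not the patented implementation.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Viewport:
    model: str          # which stage's 3D model is shown
    yaw: float = 0.0    # viewing angle (radians)
    pitch: float = 0.0
    zoom: float = 1.0   # magnification

def select_stage(viewport, stage_model):
    """Swap in the newly selected stage's 3D model while keeping the
    viewing angle and magnification computed for the current model, so
    the view does not jump when the user clicks a different stage
    button."""
    return replace(viewport, model=stage_model)
```

For example, if the user has rotated and zoomed into the lingual side of stage 1, selecting the stage 7 button via `select_stage` presents the stage 7 model at the same yaw, pitch, and zoom.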
Also described herein are systems configured to perform any of the methods described herein. These systems may include one or more processors and may include a memory coupled to the one or more processors configured to store computer-program instructions that, when executed by the one or more processors, perform the methods.
For example, a system (e.g., for assisting in treatment planning, for visualizing a subject's teeth, for reviewing and/or modifying a treatment plan) may include: one or more processors; a memory coupled to the one or more processors, the memory configured to store computer-program instructions, that, when executed by the one or more processors, perform a computer-implemented method comprising: displaying a staging toolbar on a first portion of a display, wherein the staging toolbar comprises a plurality of digital buttons that correspond to a plurality of three dimensional (3D) digital models of a subject's dentition, wherein the plurality of 3D models includes a first model that shows an arrangement of a subject's teeth before receiving a treatment, one or more intermediate models that show an arrangement of the subject's teeth during a stage of the treatment, and a final model that shows an arrangement of the subject's teeth after receiving the treatment; displaying a displayed 3D model corresponding to one of the plurality of 3D models; changing the displayed 3D model to correspond to whichever digital button is selected by a user; and adjusting a view of the displayed 3D model shown on the display based on a user input, wherein the view of the displayed 3D model is applied to the changed displayed 3D model as the user selects the digital buttons.
Any of the methods and apparatuses described herein may be configured to also or alternatively display multiple 3D models, including multiple treatment plans (each having the same or a different number of treatment stages), and/or display a 3D model (e.g., surface model) of an initial (unmodified) arrangement of the subject's teeth with one or more treatment plans (each having multiple treatment stages). The system and method may enhance review of the treatment plan(s) by allowing the user to make changes in the appearance (angle, zoom, pan, etc.), and/or selection of a displayed treatment stage when displaying multiple treatment plans, of one of the displayed 3D models and concurrently making the same (or similar) changes in the other treatment plans.
Any of these systems and methods may also address the problem of complexity associated with the display of one or more treatment plans, in which each treatment plan includes a large number of stages, and multiple potential ‘treatments’ at each stage, such as changes in the tooth position, angle, etc., as well as the components of the treatment applied or to be applied, such as interproximal reduction, extraction, ramps, attachments, hooks, etc. These components may be different at different stages of each treatment plan and may be widely different or similar between different treatment plans. The methods and apparatuses may provide simplified techniques for controlling the otherwise complicated and information-dense displays. For example, in some variations the methods and apparatuses may include informative controls that allow toggling of display options on or off, but may also include information about these features being displayed or features related to those being displayed.
For example, described herein are methods comprising: displaying, side-by-side on a display, a first three dimensional (3D) model of a subject's dentition that shows an arrangement of the subject's teeth before receiving a treatment, and a second 3D model that shows an arrangement of the subject's teeth subject to a first orthodontic treatment; wherein the second 3D model is displayed at either a final stage or an intermediate treatment stage of the first orthodontic treatment; adjusting a view of both of the first 3D model and the second 3D model based on a user input modifying a view of one of the first 3D model or the second 3D model; displaying a staging toolbar on a portion of a display, wherein the staging toolbar comprises a plurality of digital buttons that correspond to a plurality of treatment stages of the first orthodontic treatment; and changing the displayed stage of the second 3D model to correspond to a stage selected by the user from the staging toolbar.
Any of these methods may also include displaying a plurality of buttons corresponding to treatment features, wherein the buttons visually indicate the presence of the treatment feature in the first orthodontic treatment and further wherein the buttons visually indicate that that treatment feature is actively being displayed on either or both the first 3D model and the second 3D model.
The plurality of treatment features may include: tooth numbering, attachments, interproximal reduction spacing, and pontics. The user input may include one or more of: a rotation, a zoom, or a pan of the displayed 3D models. In some variations, the user input includes selecting from a set of preset views.
For example, a method may include: displaying, side-by-side on a display, a plurality of three dimensional (3D) models of a subject's dentition, wherein the plurality of 3D models includes two or more of: a first 3D model that shows an arrangement of the subject's teeth before receiving a treatment, a second 3D model that shows an arrangement of the subject's teeth subject to a first orthodontic treatment, and a third 3D model that shows an arrangement of the subject's teeth subject to a second orthodontic treatment; wherein when either or both the first 3D model and the second 3D model are displayed, the first 3D model and the second 3D model are displayed at either a final stage or an intermediate treatment stage; and adjusting a view of all of the displayed 3D models based on a user input modifying one of the plurality of 3D models.
Any of these methods (or apparatus performing them) may include displaying a staging toolbar on a portion of a display, wherein the staging toolbar comprises a plurality of digital buttons that correspond to a plurality of treatment stages, and wherein when either or both the first 3D model and the second 3D model are displayed, changing the displayed stage to correspond to a stage selected by the user from the staging toolbar. The method or apparatus may include displaying a plurality of buttons corresponding to treatment features, wherein the buttons visually indicate the presence of the treatment feature in the first orthodontic treatment or the second orthodontic treatment and further wherein the buttons visually indicate that that treatment feature is actively being displayed on either or both the first 3D model and the second 3D model. The plurality of treatment features may comprise: tooth numbering, attachments, interproximal reduction spacing, hooks, ramps, pontics, etc. The user input may include one or more of: a rotation, a zoom, or a pan of the displayed 3D models. The user input may include selecting from a set of preset views.
A system for visualizing a subject's teeth, the system comprising: one or more processors; a memory coupled to the one or more processors, the memory configured to store computer-program instructions, that, when executed by the one or more processors, perform a computer-implemented method comprising: displaying, side-by-side on a display, a first three dimensional (3D) model of a subject's dentition that shows an arrangement of the subject's teeth before receiving a treatment, and a second 3D model that shows an arrangement of the subject's teeth subject to a first orthodontic treatment; wherein the second 3D model is displayed at either a final stage or an intermediate treatment stage of the first orthodontic treatment; adjusting a view of both of the first 3D model and the second 3D model based on a user input modifying a view of one of the first 3D model or the second 3D model; displaying a staging toolbar on a portion of a display, wherein the staging toolbar comprises a plurality of digital buttons that correspond to a plurality of treatment stages of the first orthodontic treatment; and changing the displayed stage of the second 3D model to correspond to a stage selected by the user from the staging toolbar.
A system for visualizing a subject's teeth, the system comprising: one or more processors; a memory coupled to the one or more processors, the memory configured to store computer-program instructions, that, when executed by the one or more processors, perform a computer-implemented method comprising: displaying, side-by-side on a display, a plurality of three dimensional (3D) models of a subject's dentition, wherein the plurality of 3D models includes two or more of: a first 3D model that shows an arrangement of the subject's teeth before receiving a treatment, a second 3D model that shows an arrangement of the subject's teeth subject to a first orthodontic treatment, and a third 3D model that shows an arrangement of the subject's teeth subject to a second orthodontic treatment; wherein when either or both the first 3D model and the second 3D model are displayed, the first 3D model and the second 3D model are displayed at either a final stage or an intermediate treatment stage; adjusting a view of all of the displayed 3D models based on a user input modifying a view of one of the plurality of 3D models. These systems may be configured to include any of the method features described herein.
Also described herein are methods and apparatuses for reviewing, modifying, confirming and/or selecting a treatment plan that includes comparing occlusal collisions of teeth between one or more treatment plans and/or the untreated teeth. Any of the systems and methods above may include this feature, which may be a user-selectable control (e.g., a virtual button) that switches the view of the 3D model(s) from a frontal and/or side view of one or more dental arches (e.g., upper and/or lower arches) to an occlusal view showing the occlusal surfaces of the upper and/or lower dental arches. The occlusal view may include indicator(s) of the severity of contact (collision) between the upper and lower jaw in normal intercuspation of the teeth. The severity of contact (collision) may be shown relative to a threshold, resulting in two states: "low" or "normal" contact and "high" or "severe" contact. Alternatively or additionally, contact/collision may be shown as a heat map indicating a scaled degree of contact/collision and/or an annotated indicator (numerical, alphanumeric, etc.) indicating the contact severity.
For example, a method may include: displaying, on a display, a first three dimensional (3D) model of a subject's dentition that shows an arrangement of the subject's teeth before receiving a treatment, and a second 3D model that shows an arrangement of the subject's teeth subject to a first orthodontic treatment; wherein the second 3D model is displayed at either a final stage or an intermediate treatment stage of the first orthodontic treatment; adjusting a view of both of the first 3D model and the second 3D model based on a user input modifying a view of one of the first 3D model or the second 3D model; switching the view of both the first 3D model and the second 3D model to an occlusal view when the user selects a control, wherein the occlusal view shows occlusal surfaces of teeth in the first 3D model and the second 3D model, further wherein the occlusal view indicates one or more regions of inter-arch collisions on either or both of the first 3D model and the second 3D model.
The occlusal view may indicate one or more regions of inter-arch collisions using an indicator that is scaled to differentiate a relative degree of contact between an upper arch and a lower arch. The indicator may be colored differently to differentiate regions of normal contact from regions of high contact (e.g., normal contact in green, high contact in red). The method or system may be configured to calculate the regions of inter-arch collision on the first 3D model and to calculate the regions of inter-arch collision on the second 3D model. The method or system may further set a threshold (or may apply a user-specified threshold) of contact degree to differentiate normal contact from high contact. For example, in some variations the user interface may include a dial or slider that allows selection of the degree of contact.
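The two-state (threshold) and heat-map severity indicators described above can be sketched as follows. The function names, the penetration-depth input, and the specific threshold and color values are illustrative assumptions, not details of the described system:

```python
def contact_color(penetration_mm, threshold_mm=0.2):
    """Map an inter-arch penetration depth to a two-state indicator color.

    Illustrative convention: contacts at or below the threshold are
    "normal" (green); deeper collisions are "high"/"severe" (red).
    """
    if penetration_mm <= 0:
        return None  # no contact, so no indicator is drawn
    return "green" if penetration_mm <= threshold_mm else "red"


def heat_map_value(penetration_mm, max_mm=1.0):
    """Scale a penetration depth to [0, 1] for a continuous heat map."""
    return min(max(penetration_mm / max_mm, 0.0), 1.0)
```

A user-adjustable dial or slider, as mentioned above, would simply change the `threshold_mm` value passed to `contact_color`.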
In some variations, the method may include displaying a staging toolbar on a portion of a display, wherein the staging toolbar comprises a plurality of digital buttons that correspond to a plurality of treatment stages of the first orthodontic treatment; and changing the displayed stage of the second 3D model to correspond to a stage selected by the user from the staging toolbar.
In any of these methods and apparatuses, the different 3D models may be displayed side-by-side, in either the same or different windows. The method or system may include displaying an upper arch engaged with a lower arch of the first 3D model and displaying an upper arch engaged with a lower arch of the second 3D model. Any of these methods or apparatuses may be configured to display a plurality of buttons corresponding to treatment features, wherein the buttons visually indicate the presence of the treatment feature in the first orthodontic treatment and further wherein the buttons visually indicate that the treatment feature is actively being displayed on either or both the first 3D model and the second 3D model. For example, the plurality of treatment features may comprise: tooth numbering, attachments, interproximal reduction spacing, hooks, ramps (e.g., bite ramps), pontics, etc.
Adjusting the view of both of the first 3D model and the second 3D model based on the user input may include modifying one or more of: a rotation, a zoom, or a pan of the first 3D model and the second 3D model. Thus, these viewing options may be concurrently adjusted (translating the user adjustments in the display parameters of one 3D model to the other 3D model, etc., typically in real time). In some variations, the user input may include selecting from a set of preset views.
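The concurrent view adjustment described above (translating the user's rotation, zoom, and pan on one 3D model to the other displayed models in real time) might be sketched like this. The `Camera` fields and class names are hypothetical, standing in for whatever view state the rendering layer actually maintains:

```python
from dataclasses import dataclass, replace

@dataclass
class Camera:
    rotation: tuple  # Euler angles in degrees (illustrative)
    zoom: float
    pan: tuple       # (x, y) screen offset

class SynchronizedViews:
    """Mirror user view changes made on one 3D model to all models."""

    def __init__(self, n_views):
        self.cameras = [Camera((0, 0, 0), 1.0, (0, 0)) for _ in range(n_views)]

    def on_user_adjust(self, view_index, **changes):
        # Apply the adjustment to the manipulated view, then copy the
        # resulting camera to every other view so that all models are
        # presented at the same viewing angle and perspective.
        updated = replace(self.cameras[view_index], **changes)
        self.cameras = [replace(updated) for _ in self.cameras]
```

A set of preset views, as mentioned above, could be implemented as predefined `Camera` instances passed wholesale to `on_user_adjust`.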
For example, a method may include: displaying, on a display, a first three dimensional (3D) model of a subject's dentition that shows an arrangement of the subject's teeth before receiving a treatment, and a second 3D model that shows an arrangement of the subject's teeth subject to a first orthodontic treatment; wherein the second 3D model is displayed at either a final stage or an intermediate treatment stage of the first orthodontic treatment; adjusting a view of both of the first 3D model and the second 3D model based on a user input modifying a view of one of the first 3D model or the second 3D model, wherein adjusting the view comprises adjusting one or more of the rotation, zoom and pan; switching the view of both the first 3D model and the second 3D model to an occlusal view when the user selects a control, wherein the occlusal view shows occlusal surfaces of teeth in the first 3D model and the second 3D model, further wherein the occlusal view indicates one or more regions of inter-arch collisions on either or both of the first 3D model and the second 3D model, using an indicator that is scaled to differentiate a relative degree of contact between an upper arch and a lower arch.
In any of these methods and apparatuses, the indicator may be colored or marked differently to differentiate regions of normal contact from regions of high contact. The indicator may be a region associated with the button (e.g., within a boundary of the button), such as a box, circle, dot, etc., on the button and/or a marking on the button, including the text used to indicate the primary function of the button (e.g., attachments, IPR, pontics, extraction(s), etc.).
Any of these methods or apparatuses may include calculating regions of inter-arch collision on the first 3D model and calculating regions of inter-arch collision on the second 3D model, as part of an inter-arch collision calculator module. Any of these methods or apparatuses may also or alternatively include displaying a staging toolbar on a portion of a display, wherein the staging toolbar comprises a plurality of digital buttons that correspond to a plurality of treatment stages of the first orthodontic treatment; and changing the displayed stage of the second 3D model to correspond to a stage selected by the user from the staging toolbar.
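An inter-arch collision calculator of the kind referenced above could, in its simplest form, measure proximity between points sampled from the upper and lower arch surfaces. The following brute-force sketch is an illustrative assumption only; a production implementation would more likely use mesh distance fields or a spatial index:

```python
import math

def inter_arch_collisions(upper_points, lower_points, contact_dist=0.1):
    """Naive contact check between two arches given as 3D point clouds.

    For each upper-arch point, returns the distance to the nearest
    lower-arch point when that distance is within `contact_dist`
    (i.e., the surfaces are in contact there), else None. All names
    and the millimeter-scale default are illustrative.
    """
    results = []
    for p in upper_points:
        d = min(math.dist(p, q) for q in lower_points)
        results.append(d if d <= contact_dist else None)
    return results
```

The per-point distances returned here are the kind of raw values a severity indicator (threshold or heat map) would then color-code.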
As mentioned, any of the methods and apparatuses may include side-by-side display, and/or displaying the upper arch engaged with the lower arch of the first 3D model and displaying the upper arch engaged with the lower arch of the second 3D model (e.g., showing the upper and lower arch intercuspating).
Any of these methods and apparatuses may include displaying a plurality of buttons corresponding to treatment features, wherein the buttons visually indicate the presence of the treatment feature in the first orthodontic treatment and further wherein the buttons visually indicate that the treatment feature is actively being displayed on either or both the first 3D model and the second 3D model. As mentioned above, the plurality of treatment features may include: tooth numbering, attachments, interproximal reduction spacing, ramps, hooks, pontics, etc. Adjusting the view of both of the first 3D model and the second 3D model based on the user input may include modifying one or more of: a rotation, a zoom, or a pan of the first 3D model and the second 3D model. The user input may include selecting from a set of preset views.
Also described herein are systems for visualizing a subject's teeth that may include: one or more processors; a memory coupled to the one or more processors, the memory configured to store computer-program instructions, that, when executed by the one or more processors, perform a computer-implemented method comprising: displaying, on a display, a first three dimensional (3D) model of a subject's dentition that shows an arrangement of the subject's teeth before receiving a treatment, and a second 3D model that shows an arrangement of the subject's teeth subject to a first orthodontic treatment; wherein the second 3D model is displayed at either a final stage or an intermediate treatment stage of the first orthodontic treatment; adjusting a view of both of the first 3D model and the second 3D model based on a user input modifying a view of one of the first 3D model or the second 3D model; switching the view of both the first 3D model and the second 3D model to an occlusal view when the user selects a control, wherein the occlusal view shows occlusal surfaces of teeth in the first 3D model and the second 3D model, further wherein the occlusal view indicates one or more regions of inter-arch collisions on either or both of the first 3D model and the second 3D model. As mentioned above, any of these systems may be further configured to perform any of the method steps described above (e.g., by including one or more software modules or components for performing them).
Also described herein are systems for visualizing a subject's teeth that include: one or more processors; a memory coupled to the one or more processors, the memory configured to store computer-program instructions, that, when executed by the one or more processors, perform a computer-implemented method comprising: displaying a staging toolbar on a first portion of a display, wherein the staging toolbar comprises one or more digital buttons that correspond to a plurality of three dimensional models, wherein the plurality of three dimensional models includes a first model that shows the subject's teeth position before receiving treatment, one or more intermediate models that show the subject's teeth position during treatment, and a final model that shows the subject's teeth position after receiving treatment; presenting a multiple view digital button on a second portion of the display; and presenting on the display both the first model and a second model when the multiple view digital button is selected by a user, wherein the second model is selected from one of the intermediate models or the final model. The second model may be determined by a selection of one of the digital buttons of the staging toolbar. The computer-implemented method may further comprise: presenting a feature button on the display; and presenting, on the display, the feature on the three dimensional model of the subject's teeth. For example, the feature button may be an attachment button, wherein the computer-implemented method further comprises presenting, on the display, attachments on the subject's teeth when the attachment button is selected. The feature button may be a pontics button, wherein the computer-implemented method further comprises presenting, on the display, pontics on the subject's teeth when the pontics button is selected.
The feature button may be an interproximal reduction and space management button, wherein the computer-implemented method further comprises presenting, on the display, interproximal reduction and space management data on the subject's teeth when the interproximal reduction and space management button is selected.
Any of these methods may also include: presenting an on state for the feature button when the feature button is selected by the user; presenting an off state for the feature button when the feature button is not selected by the user, wherein the on state is visually distinguishable from the off state; and presenting the feature on the three dimensional model when the feature button is in the on state; not presenting the feature on the three dimensional model when the feature button is in the off state; and presenting an indicator associated with the feature button that indicates whether the feature is present or absent from the treatment.
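The feature-button behavior described above distinguishes two independent pieces of state: whether the feature is present in the treatment plan at all (the indicator) and whether the button is currently in the on state (the feature is being rendered). A minimal sketch, with illustrative names:

```python
class FeatureButton:
    """A toggleable feature button with a separate presence indicator.

    `present_in_plan` records whether the feature (attachments, IPR,
    pontics, etc.) exists in the treatment plan; `on` records whether
    it is currently displayed on the 3D model. A feature can be
    present in the plan yet hidden from view.
    """

    def __init__(self, name, present_in_plan):
        self.name = name
        self.present_in_plan = present_in_plan  # indicator state
        self.on = False                         # display (on/off) state

    def toggle(self):
        self.on = not self.on

    def should_render(self):
        # The feature is drawn only if it exists in the plan and the
        # button is in the visually distinguishable "on" state.
        return self.present_in_plan and self.on
```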
As mentioned above, the staging toolbar may include one or more hidden overcorrection stages, wherein the computer-implemented method further comprises presenting the one or more hidden overcorrection stages when the user clicks or selects a button next to the staging toolbar or integrated into one end of the staging toolbar. The feature button may be an occlusal button, wherein the computer-implemented method further comprises presenting, on the display, occlusal contacts on the subject's teeth when the occlusal button is selected. For example, the occlusal contacts may comprise normal occlusal contacts that are shown in a first color and heavy inter-arch collisions that are shown in a second color.
Any of these methods may include: storing in a memory, a plurality of three dimensional models of the subject's teeth, wherein the plurality of three dimensional models includes a first model that shows the subject's teeth position before receiving treatment, one or more intermediate models that show the subject's teeth position during treatment, and a final model that shows the subject's teeth position after receiving treatment. The method may further include displaying, using the processor, a staging toolbar on a first portion of the display, wherein the staging toolbar comprises one or more digital buttons that correspond to the plurality of three dimensional models; displaying, using the processor, a multiple view digital button on a second portion of the display; and displaying, using the processor, on the display both the first model and a second model when the multiple view digital button is selected by a user, wherein the second model is selected from one of the intermediate models or the final model. Alternatively or additionally, the method may include: displaying, using the processor, a feature button on the display; and displaying, using the processor, on the display the feature on the three dimensional model of the subject's teeth.
As mentioned, the feature button may be an attachment button, wherein the method may further comprise displaying, using the processor, on the display attachments on the subject's teeth when the attachment button is selected. The feature button may be a pontics button, wherein the method may further comprise displaying, using the processor, on the display pontics on the subject's teeth when the pontics button is selected. The feature button may be an interproximal reduction and space management button, wherein the method may further comprise displaying, using the processor, interproximal reduction and space management data on the subject's teeth when the interproximal reduction and space management button is selected.
The method may include displaying, using a processor, an on state for the feature button when the feature button is selected by the user; displaying, using a processor, an off state for the feature button when the feature button is not selected by the user, wherein the on state is visually distinguishable from the off state; displaying, using a processor, the feature on the three dimensional model when the feature button is in the on state; not displaying, using a processor, the feature on the three dimensional model when the feature button is in the off state; and displaying, using a processor, an indicator associated with the feature button that indicates whether the feature is present or absent from the treatment. The staging toolbar may comprise one or more hidden overcorrection stages, wherein the method further comprises displaying the one or more hidden overcorrection stages when the user clicks or selects a button next to the staging toolbar or integrated into one end of the staging toolbar. The feature button may be an occlusal button, wherein the method further comprises displaying on the display occlusal contacts on the subject's teeth when the occlusal button is selected.
For example, a method of visualizing a subject's teeth on a display with a processor may include: storing in a memory, a plurality of three dimensional models of the subject's teeth, wherein the plurality of three dimensional models includes a first model that shows the subject's teeth position before receiving treatment, one or more intermediate models that show the subject's teeth position during treatment, and a final model that shows the subject's teeth position after receiving treatment; displaying, using the processor, a staging toolbar on a first portion of the display, wherein the staging toolbar comprises one or more digital buttons that correspond to the plurality of three dimensional models; displaying, using the processor, a multiple view digital button on a second portion of the display; and displaying, using the processor, on the display both the first model and a second model when the multiple view digital button is selected by a user, wherein the second model is selected from one of the intermediate models or the final model.
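The staging toolbar that recurs throughout these methods (one button per stored stage model, with selection changing which stage is displayed) can be sketched as a small state object. All names and the default-to-final-stage choice are illustrative assumptions:

```python
class StagingToolbar:
    """One digital button per treatment stage; selecting a button
    changes which stage model is displayed."""

    def __init__(self, n_stages):
        self.n_stages = n_stages        # stage 0 = initial malocclusion
        self.selected = n_stages - 1    # assumed default: final stage

    def select(self, stage):
        if not 0 <= stage < self.n_stages:
            raise IndexError("no such treatment stage")
        self.selected = stage

    def model_for_display(self, stage_models):
        # `stage_models` is the ordered list of per-stage 3D models
        # stored in memory (initial, intermediates, final).
        return stage_models[self.selected]
```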
BRIEF DESCRIPTION OF THE DRAWINGS
The novel features of the invention are set forth with particularity in the claims that follow. A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings of which:
FIG. 1A is a diagram illustrating one example of a treatment plan review and/or modification system as described herein.
FIG. 1B is a diagram illustrating one example of an occlusal contact engine as described herein.
FIG. 1C schematically illustrates one example of a method of treatment plan review and/or modification as described herein.
FIG. 1D schematically illustrates one example of a method of treatment plan review and/or modification including occlusal contact severity.
FIGS. 2A-2C illustrate dual and single view display modes for viewing 3D models of a subject's teeth at various stages of a treatment plan.
FIG. 3 illustrates a dual view that can be combined with an occlusal view.
FIG. 4 illustrates the ability to hide and visualize features in dual view.
FIG. 5A-5G illustrate viewing multiple treatment plans and various features that can be used during treatment.
FIGS. 6A-6C illustrate an occlusal view button with three indicator states.
FIGS. 7A and 7B illustrate how to switch between a view of multiple treatment plans and a view of a single treatment plan.
FIGS. 8A and 8B illustrate overcorrection stages that can be hidden and unhidden.
FIG. 9 illustrates a treatment form that can be used to prompt the doctor about overcorrection stages.
FIG. 10 illustrates an embodiment of a multiple treatment plan view with occlusal view switched on to display occlusal contacts.
FIGS. 11A and 11B illustrate that the multiple treatment plan view can be toggled between a closed mouth view and an open mouth occlusal view.
FIG. 12 illustrates that rotation of a 3D model in the multiple treatment plan view simultaneously rotates the other 3D models such that all the models are presented at the same viewing angle and perspective.
FIGS. 13A-13C illustrate various single treatment plan views with the occlusal view switched on.
FIGS. 14A and 14B illustrate various dual views with the occlusal view switched on.
DETAILED DESCRIPTION
Orthodontic devices such as aligners, palatal expanders, retainers, and dental implants can be used to adjust the position of teeth and to treat various dental irregularities. To help the clinician or doctor (e.g., an orthodontist) design and plan the subject's treatment, a 3D digital model of the subject's teeth, dentition, and gingiva can be constructed from a 3D scan of the subject's mouth, teeth, dentition, and gingiva. The 3D model of the subject's teeth and dentition can be displayed graphically to the doctor on a display using a computing system with memory and software. Input devices such as a mouse and/or keyboard allow the doctor to manipulate the 3D model. The systems and methods described herein are particularly well suited to be used in procedures involving aligners, but the systems and methods are also suitable for use with staging other types of orthodontic devices.
For example, the 3D model of the dentition can be rotated about any axis and can be zoomed in and out as desired. Each individual tooth can be a separate object in the 3D model that can be manipulated by the doctor. From the initial teeth position, the doctor can manipulate the teeth using the input devices into a desired final teeth position. The computer system can then determine the appropriate intermediate stages that can be used to move the teeth from the initial teeth position to the final teeth position. The initial, final, and intermediate teeth position stages can be displayed to the doctor.
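The computation of intermediate stages between the initial and final teeth positions could, in its simplest form, interpolate each tooth's position linearly. This is a sketch under stated assumptions only; real staging also interpolates rotations and enforces per-stage movement limits and collision constraints, and the tooth IDs and data shapes here are illustrative:

```python
def intermediate_stages(initial, final, n_stages):
    """Linearly interpolate per-tooth 3D positions between the initial
    and final arrangements, producing `n_stages` staged targets (the
    last stage equals the final position)."""
    stages = []
    for k in range(1, n_stages + 1):
        t = k / n_stages  # fraction of total movement at stage k
        stage = {}
        for tooth, p0 in initial.items():
            p1 = final[tooth]
            stage[tooth] = tuple((1 - t) * a + t * b for a, b in zip(p0, p1))
        stages.append(stage)
    return stages
```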
One or more graphical toolbars can be displayed to the doctor to facilitate making adjustments to the teeth and arch. The toolbars can have buttons to perform various actions to the 3D model. For example, one toolbar can have buttons that allow the doctor to manipulate the viewing angle and perspective of the 3D model.
Another toolbar can have buttons for making various tooth adjustments, such as extrusion/intrusion, bucco-lingual translation, mesio-distal translation, rotation, crown angulation, bucco-lingual root torque, bucco-lingual crown tip, and mesio-distal crown tip. In some embodiments, when one tooth is adjusted, some or all the other teeth in the same arch will automatically adjust in response. In some embodiments, the doctor can lock and keep a particular tooth at a desired position, and designate a tooth as unmovable for the duration of a treatment (e.g. crowns, implants).
Another toolbar can have buttons for making attachments and precision cuts. These buttons allow the doctor to add conventional attachments and precision cuts by simply dragging and dropping the attachment or cut to the tooth of choice, and the doctor can easily remove attachments and precision cuts by dragging them to the trash can. The buttons also allow the doctor to adjust the placement and rotation of conventional attachments, and to change the size, prominence, and degree of beveling of rectangular attachments. In addition, the buttons allow the doctor to fine-tune the mesiodistal position of button cutouts.
Another toolbar can have buttons for posterior arch expansion and contraction. This toolbar allows the doctor to expand or contract posterior arches by expanding or contracting the upper arch only, the lower arch only, or both arches. As above, when an arch modification is made on the 3D model, some or all other teeth in the adjusted arch will automatically adjust in response.
Another toolbar can have buttons for interproximal reduction (IPR) and space management. With the IPR and spacing toolbar, the doctor can choose to (1) select the auto adjust option: IPR and space automatically adjusts as you make adjustments on the 3D model; (2) select the keep current option: to preserve the current IPR configuration; (3) select the no IPR option: all existing IPR will be removed, and no IPR will be added; and (4) manually adjust IPR and space on the 3D model (add, remove or lock for specific teeth).
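The three toolbar-selectable IPR modes above (auto adjust, keep current, no IPR) amount to a simple policy over the IPR configuration. The enum and the dictionary shapes below are illustrative assumptions, not the actual implementation:

```python
from enum import Enum

class IPRMode(Enum):
    AUTO_ADJUST = "auto"   # IPR/space updates as the model changes
    KEEP_CURRENT = "keep"  # preserve the current IPR configuration
    NO_IPR = "none"        # remove existing IPR, add no new IPR

def apply_ipr_mode(mode, current_ipr, proposed_ipr):
    """Resolve which IPR values to use for the next staging pass.

    `current_ipr` and `proposed_ipr` map interproximal sites to
    reduction amounts (mm); names and units are illustrative.
    """
    if mode is IPRMode.AUTO_ADJUST:
        return dict(proposed_ipr)
    if mode is IPRMode.KEEP_CURRENT:
        return dict(current_ipr)
    return {}  # NO_IPR: all existing IPR removed, none added
```

Manual per-tooth adjustment (add, remove, or lock) would then edit individual entries of the resolved dictionary.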
Additional features that can be included in a toolbar include occlusal contacts, which identifies and displays to the doctor all or a subset of inter-arch occlusal contacts, allowing heavy occlusal contacts to be resolved directly on the 3D model. Another feature can be dual view, where modifications made using 3D Controls may be compared side-by-side with the original setup. Another feature is a Bolton analysis tool that provides reference information pertaining to tooth size discrepancy that is useful for planning how to address tooth interdigitation and arch coordination. Another tool positions the 3D model on a grid that allows linear tooth movements to be measured and provides more precise control to the doctor to make measurements on the 3D model. Another feature is a superimposition toolbar that superimposes tooth position at any stage in relation to tooth position at any other stage, and controls which stage is blue (or another color) and which stage is white (or another different color) for better visualization between stages.
Dual View
In some embodiments, the display can provide the doctor a dual view that shows and compares in one screen or display a first 3D model of the teeth position before the treatment (initial malocclusion) with a second 3D model of the teeth position at any stage of the treatment, such as an intermediate stage or the final stage. For example, FIG. 2A illustrates the initial malocclusion in a first 3D model 200 on the left side of the display, while the right side of the display shows a 3D model 202 of the dentition at the 10th stage of treatment. A toolbar 204 at the bottom of the screen allows the doctor to select the stages to be displayed in dual view. The stages to be viewed can be selected by simply clicking the corresponding button on the toolbar 204. In some embodiments, a default stage that is typically shown is the initial stage that shows the initial malocclusion. In some embodiments, the default stage can be changed to an intermediate stage. For example, the doctor can drag and drop a button representing one of the intermediate stages over the default 3D model shown on the left side of the screen in order to replace the initial stage with an intermediate stage. Using dual view, the doctor can understand how the treatment progresses from stage to stage in comparison to the initial malocclusion. In some embodiments, the first 3D model can be an intermediate stage and the second 3D model can be a subsequent intermediate stage or the final stage.
Using additional tools in the dual view gives the doctor additional details for the treatment in comparison to the initial malocclusion and teeth position and allows the doctor to view the effect of a particular action on teeth movement on any particular stage and allows the comparison between the initial malocclusion with any stage of the treatment. Any of the tools described herein can be used in dual view to manipulate either of the 3D models shown in dual view. For example, using the occlusal view tool when in dual view gives the doctor the ability to view and compare maxillary and mandibular occlusal view for the initial malocclusion with the maxillary and mandibular occlusal view at any stage of the treatment.
Using other tools (e.g., Attachments, IPR, Pontic) in dual view gives the doctor additional information about the used features for the treatment and in comparison to the initial malocclusion, and the doctor can more easily and quickly analyze how these features work and whether there is a clinical reason to use them for this particular treatment.
FIG. 1A is a diagram showing an example of a treatment plan review and/or modification system 100A. The modules of the system 100A may include one or more engines and datastores. A computer system can be implemented as an engine, as part of an engine or through multiple engines. As used herein, an engine includes one or more processors or a portion thereof. A portion of one or more processors can include some portion of hardware less than all of the hardware comprising any given one or more processors, such as a subset of registers, the portion of the processor dedicated to one or more threads of a multi-threaded processor, a time slice during which the processor is wholly or partially dedicated to carrying out part of the engine's functionality, or the like. As such, a first engine and a second engine can have one or more dedicated processors or a first engine and a second engine can share one or more processors with one another or other engines. Depending upon implementation-specific or other considerations, an engine can be centralized or its functionality distributed. An engine can include hardware, firmware, or software embodied in a computer-readable medium for execution by the processor. The processor transforms data into new data using implemented data structures and methods, such as is described with reference to the figures herein.
The engines described herein, or the engines through which the systems and devices described herein can be implemented, can be cloud-based engines. As used herein, a cloud-based engine is an engine that can run applications and/or functionalities using a cloud-based computing system. All or portions of the applications and/or functionalities can be distributed across multiple computing devices, and need not be restricted to only one computing device. In some embodiments, the cloud-based engines can execute functionalities and/or modules that end users access through a web browser or container application without having the functionalities and/or modules installed locally on the end-users' computing devices.
As used herein, datastores are intended to include repositories having any applicable organization of data, including tables, comma-separated values (CSV) files, traditional databases (e.g., SQL), or other applicable known or convenient organizational formats. Datastores can be implemented, for example, as software embodied in a physical computer-readable medium on a specific-purpose machine, in firmware, in hardware, in a combination thereof, or in an applicable known or convenient device or system. Datastore-associated components, such as database interfaces, can be considered “part of” a datastore, part of some other system component, or a combination thereof, though the physical location and other characteristics of datastore-associated components is not critical for an understanding of the techniques described herein.
Datastores can include data structures. As used herein, a data structure is associated with a particular way of storing and organizing data in a computer so that it can be used efficiently within a given context. Data structures are generally based on the ability of a computer to fetch and store data at any place in its memory, specified by an address, a bit string that can be itself stored in memory and manipulated by the program. Thus, some data structures are based on computing the addresses of data items with arithmetic operations; while other data structures are based on storing addresses of data items within the structure itself. Many data structures use both principles, sometimes combined in non-trivial ways. The implementation of a data structure usually entails writing a set of procedures that create and manipulate instances of that structure. The datastores, described herein, can be cloud-based datastores. A cloud-based datastore is a datastore that is compatible with cloud-based computing systems and engines.
An example of a treatment plan review and/or modification system 100A such as that shown in FIG. 1A may include a computer-readable medium 102, view modification user input engine(s) 104, staging toolbar engine(s) 106, display output engine(s) 108, an untreated 3D model datastore 110, one or more treated 3D model datastore(s) 112, an overcorrection stage engine 114, an informative button engine 116, and an occlusal contact engine 118. One or more of the modules of the system 100A may be coupled to one another (e.g., through the example couplings shown in FIG. 1A) or to modules not explicitly shown in FIG. 1A. The computer-readable medium 102 may include any computer-readable medium, including without limitation a bus, a wired network, a wireless network, or some combination thereof.
The view modification user input engine(s) 104 may implement one or more automated agents configured to receive user input on the position and/or orientation of a displayed 3D model of a subject's teeth. The system may coordinate (e.g., as part of a view coordination engine, not shown) the display of each of the 3D models concurrently being displayed so that changes made by the user to one model, including changes in position, orientation, etc., are reflected in all or some of the other models. In various implementations, the view modification user input engine(s) 104 may implement one or more automated agents configured to coordinate (in conjunction with the display output engine(s) 108) the 3D virtual representations of the patient's dental arches.
In some variations, the view modification user input engine includes digital controls (e.g., digital buttons), such as the informative buttons (and may therefore interact with the informative button engine 116). In some variations one or more buttons may include, for example, tooth numbering. Tooth numbering may be determined by the system or may be read as information about the tooth numbering in each 3D model (e.g., stored as part of the 3D model datastores). For example, a 3D model datastore (e.g., untreated 3D model or treated 3D model datastore) may be configured to store one or more tooth type identifiers of different tooth types. In some implementations, the tooth type identifiers correspond to numbers of a Universal Tooth Numbering System, character strings to identify tooth types by anatomy, images or portions thereof to identify tooth types by geometry and/or other characteristics, etc.
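A tooth type identifier record such as the 3D model datastores described above might hold could be sketched as follows; the dictionary contents and function names are illustrative assumptions, not the patent's implementation:

```python
# Hypothetical mapping from Universal Tooth Numbering System numbers to
# anatomical character strings (a small illustrative subset).
UNIVERSAL_TOOTH_NAMES = {
    8: "upper right central incisor",
    9: "upper left central incisor",
    24: "lower left central incisor",
    25: "lower right central incisor",
}

def tooth_label(universal_number):
    """Resolve a Universal Numbering System tooth number to an anatomy
    string, as a tooth type identifier stored with a 3D model might be."""
    return UNIVERSAL_TOOTH_NAMES.get(universal_number, "unknown tooth type")
```

A datastore could equally key identifiers by character strings or by geometric signatures, as the passage above notes.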
The staging toolbar engine 106 coordinates the user selection of one or more stages of the treatment plan represented by one or more of the 3D digital models of the subject's teeth. The staging toolbar engine may map selected staging buttons (e.g., buttons labeled numerically and/or alphanumerically with one or more treatment stage indicators) to the display of a corresponding stage in each of the 3D models (or in comparison to the untreated 3D model).
In general, the display output engine 108 is configured to coordinate the display of 3D model(s) of the subject's teeth with each other and with changes made by the user (e.g., physician, doctor, dentist, dental technician, etc.).
The overcorrection stage engine 114 may determine and/or coordinate display of one or more overcorrection stages, as will be described in greater detail below.
An informative button engine 116 may coordinate the use of one or more informative buttons that may modify the information specific to each (or all) of the treatment plan 3D digital models. The engine may process this information into a user-selectable button that shows both the status of the button (e.g., on/off) and information based on all or a subset of the treatment plans, such as whether all or some of the treatment plans include one or more features (e.g., treatment features such as interproximal reduction (IPR), attachments, hooks, tooth ramps, etc.).
The system may include one or more datastores, including an untreated 3D (digital) model datastore 110 that may store the 3D model of the patient's untreated teeth, e.g., upper and/or lower arch. The 3D model of the patient's untreated teeth may be imported (e.g., from an external file, remote server, etc.), scanned, e.g., using an intraoral scanner from a patient or a model of a patient's teeth and stored, or otherwise acquired by the system. Similarly, the system may include a treated 3D (digital) model datastore 112 for storing one or more 3D models of the patient's teeth during each stage of a treatment plan, including the final stage. The datastore may also store information specific to the treatment plan, including features used to achieve tooth movement (including location on the teeth, etc.), number of stages, etc., or any other meta-information related to the referenced treatment plan. The treated 3D (digital) models may be generated by the system or a separate system and imported/entered into the datastore. Any number of treatment 3D models (1, 2, 3, 4, 5, 6, 7, 8, 9, 10, etc.) may be stored and used, including selection by a user of which ones to show or display.
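A treated 3D model datastore of the kind described above might be organized as follows; the class and field names are assumptions made for illustration only:

```python
# Hypothetical sketch of a treated 3D model datastore: models keyed by
# stage number, plus treatment-plan metadata (features, stage count).
class TreatedModelDatastore:
    def __init__(self, num_stages, features):
        self.num_stages = num_stages      # total number of treatment stages
        self.features = set(features)     # e.g., {"attachments", "ipr"}
        self._models = {}                 # stage number -> 3D model data

    def store(self, stage, model):
        """Store the 3D model for one stage of the treatment plan."""
        self._models[stage] = model

    def model_for_stage(self, stage):
        """Return the stored model for a stage, or None if absent."""
        return self._models.get(stage)

    def final_model(self):
        """Return the model for the final stage of the treatment."""
        return self._models.get(self.num_stages)

# Example: a 10-stage plan using attachments and IPR.
store = TreatedModelDatastore(num_stages=10, features=["attachments", "ipr"])
store.store(10, "final_stage_mesh")
```

Keying models by stage number makes it straightforward for a staging toolbar to fetch whichever stage the user selects.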
Any of these systems may also include an occlusal contact engine 118. FIG. 1B shows a schematic example of another type of occlusal contact engine 118. In this example the occlusal contact engine may be invoked by a user command (e.g., selection of a user input such as a digital button, switch, etc.). The occlusal contact engine may include an occlusal view display engine 120 that may receive an instruction to switch a current view of one or more digital model(s) (e.g., a plurality of concurrently displayed digital models) into a view showing occlusal surfaces of one or both of the upper and lower arches. The system may translate the current 3D model display(s) into occlusal views, with the upper and lower arch, when both are shown concurrently, arranged with the upper arch above the lower arch, and both spread essentially flat. The occlusal contact engine may also include a collision contact engine 122 for calculating (or receiving from an outside source that has already pre-calculated) the occlusal contact between the teeth of the upper and lower jaws. The collision contact engine may estimate, from the 3D models, the normal intercuspation for each of the upper and lower jaws, and may determine where the intercuspation results in collision or contact between the teeth of the upper jaw and the teeth of the lower jaw. Both location and severity of collision may be estimated by the collision contact engine. If the collisions are predetermined and passed to the occlusal contact engine, they may be stored in an equivalent datastore (e.g., a collision contact datastore, not shown); alternatively, the collision contact information calculated by the collision contact engine may be stored in a collision contact datastore. A collision scoring engine 124 may score the extent of the collision between the teeth of the upper and lower arch during intercuspation. The score may be qualitative and/or quantitative.
The scoring engine may apply a threshold (e.g., from the collision threshold engine 128) to determine if a collision is mild, extreme, etc. The scoring engine may apply a threshold based on, e.g., a patient set threshold. For example, the collision threshold engine 128 may present a control on the display that the user may adjust to set or change the threshold (this may be reflected dynamically in real time in the display of the occlusal collisions). The occlusal contact engine may also include a collision display engine 126 that coordinates the display of the determined collisions onto the 3D model(s) of the patient's teeth. For the treatment plan 3D models the occlusion(s) may be graphically illustrated on all of the treatment stages or in just the last (e.g., final) treatment stage(s).
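The collision scoring and thresholding described above could be sketched as follows. This is a hedged illustration only: the patent does not specify the geometry representation, and the penetration-depth computation, units, and threshold value here are all assumptions:

```python
# Assumed input: paired occlusal surface heights sampled from the upper
# and lower arch models at intercuspation, in arbitrary distance units.
def collision_severity(upper_z, lower_z):
    """Per-sample penetration depth: positive wherever the lower surface
    rises above the upper surface (i.e., the meshes interpenetrate)."""
    return [max(0.0, lz - uz) for uz, lz in zip(upper_z, lower_z)]

def score_collisions(depths, heavy_threshold=0.5):
    """Qualitative score per sample using an adjustable threshold, as the
    collision threshold engine might apply: none, normal, or heavy."""
    labels = []
    for d in depths:
        if d == 0.0:
            labels.append("none")
        elif d < heavy_threshold:
            labels.append("normal")
        else:
            labels.append("heavy")
    return labels
```

For example, `score_collisions([0.0, 0.2, 0.8])` classifies the three samples as no contact, normal contact, and heavy collision. Lowering `heavy_threshold` via an on-screen control would reclassify more contacts as heavy, which could then be re-rendered in real time as the passage describes.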
Any of these systems may also include a modification engine (not shown) configured to receive user modification of one or more treatment plans, which may be used to submit for the generation of new treatment plan(s). Any of these systems may also include final approval and fabrication engine(s), not shown. An aligner fabrication engine(s) may implement one or more automated agents configured to fabricate an aligner. Examples of an aligner are described in detail in U.S. Pat. No. 5,975,893, and in published PCT application WO 98/58596, which is herein incorporated by reference for all purposes. Systems of dental appliances employing technology described in U.S. Pat. No. 5,975,893 are commercially available from Align Technology, Inc., Santa Clara, Calif., under the tradename Invisalign System. Throughout the description herein, the use of the terms “orthodontic aligner”, “aligner”, or “dental aligner” is synonymous with the use of the terms “appliance” and “dental appliance” in terms of dental applications. For purposes of clarity, embodiments are hereinafter described within the context of the use and application of appliances, and more specifically “dental appliances.” The aligner fabrication engine(s) may be part of 3D printing systems, thermoforming systems, or some combination thereof.
FIGS. 2B and 2C illustrate a tool button 206, shown in the upper left corner of the display in this embodiment, for toggling the display between dual view and single view. In FIG. 2B, the dual view button 206 is not selected and the stage 10 button is selected in the stage selection toolbar 204, which results in stage 10 of the treatment plan being shown in the display in single view mode. In FIG. 2C, the dual view button 206 has been selected along with stage 10 in the stage selection toolbar 204 to show the initial malocclusion on the left hand side and stage 10 on the right side of the display.
FIG. 3 illustrates dual view combined with an occlusal view that can be selected by toggling an occlusal view button 300. In FIG. 3, the dual view button 206 and the occlusal view button 300 have been selected along with the button for stage 10 in the stage selection toolbar 204. This results in the occlusal contacts of the dentition in the initial malocclusion being shown and compared with the occlusal contacts of stage 10.
FIG. 4 illustrates that, in the dual view mode, the doctor is able to visualize or hide attachments 402 and other aligner features, IPR/space information 404 for the teeth contacts, and pontics using the Attach, IPR and Pontic buttons 400 on the toolbar. This allows the doctor to analyze how these features are used for the particular treatment, what these features help to treat, and whether addition, removal, or adjustment of any of these features would help with the treatment plan. Dual view in this case helps the doctor visualize and analyze this information in comparison to the initial malocclusion view.
In the dual view mode, the doctor is able to switch the right 3D model to any stage of the treatment using staging toolbar 204 at the bottom of the window in order to visualize and compare treatment details of the desired stage to the initial malocclusion (i.e., initial teeth position). As shown in FIG. 4, stage 10 is selected by selecting the #10 button on the staging toolbar 204.
Tools with Additional Indicators
In some embodiments, when the doctor reviews a case with multiple treatment plans available, multiple treatment plans (MTP) can be shown in one screen. The doctor can also select one of the plans to obtain further details of the plan and switch to a single plan view. The various features used or not used in the various treatment plans, such as attachments, IPR/spaces, pontics and/or information about occlusal contacts, can be shown simultaneously in the multiple view mode as shown in FIGS. 5A-5H.
Tools, such as toolbars with icons and buttons with special indicators, can indicate to the doctor whether a particular feature is present or absent in the plan/plans shown on the screen. Even if the doctor switches visualization of some features OFF, the indicator's color still clearly shows whether the feature is used in the treatment or absent. For example, if a feature is used in the treatment, the indicator for that feature (e.g., a circle, square, triangle, or other shaped object on the button) can be colored and filled in, and when the feature is not used, the indicator can be empty and uncolored.
This information is especially useful in the MTP view when the doctor is switching between alternative treatment plans using filters with clinical parameters and features. The state of the tool buttons for the various features allows the doctor to quickly determine whether a feature is present or absent in a particular treatment plan. For example, suppose that in the previous search some feature was absent (its indicator is empty or uncolored and, for example, its tool is switched OFF), but in the new search that feature is present (the tool button is still switched OFF, but the indicator becomes colored to show that the feature is now present). The doctor will see the changed indicator, can decide whether to review details regarding the changed feature, and can switch the related feature tool ON to visualize this information on the 3D model.
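The key point above is that each feature button carries two independent pieces of state: the doctor-controlled visualization toggle (ON/OFF) and the indicator (filled/empty) derived from the displayed plans. A minimal sketch, with all names assumed for illustration:

```python
class FeatureButton:
    """Hypothetical feature tool button with a toggle and an indicator."""

    def __init__(self, name):
        self.name = name
        self.visualization_on = True  # doctor-controlled ON/OFF toggle

    def indicator(self, treatment_plans):
        """Filled if any displayed plan uses the feature, else empty.
        Independent of whether visualization is switched ON or OFF."""
        used = any(self.name in plan["features"] for plan in treatment_plans)
        return "filled" if used else "empty"

# Two plans currently shown in the MTP view (illustrative data).
plans = [{"features": {"attachments", "ipr"}}, {"features": {"ipr"}}]

attach = FeatureButton("attachments")
attach.visualization_on = False  # switched OFF, yet the indicator below
                                 # still reports that attachments are used
```

Because the indicator is recomputed from the plans rather than from the toggle, switching filters or searches changes the indicator while leaving the doctor's ON/OFF choice intact, matching the behavior described for FIGS. 5A-5H.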
In FIG. 5A, the attachment filter 508 is set to “yes”, which means that attachments are used in the treatment plans, and the attach tool/button 500 and the IPR tool/button 502 are switched ON, which means that information for those features (i.e., attachments 501 and values for interproximal reductions 503) is shown on the 3D models for the treatment plans. In addition, the occlus tool/button 504 and pontic tool/button 506 are switched OFF, meaning that information for those features is hidden and not displayed on the 3D models. Furthermore, the attach tool/button 500 and the IPR tool/button 502 have filled/colored indicators (shown on the button as a filled circle), which means that the treatment plans shown in these views use these features in the treatment. In contrast, the occlus tool/button 504 and the pontic tool/button 506 have an empty indicator (shown on the button as an empty circle), which means that these features are absent in the shown treatment plans.
In FIG. 5B, the attachment filter 508 has been set to “no” in both treatment plans, which means that attachments aren't used in the treatment plans. Consequently, although the attach tool/button 500 is still switched ON, the attach tool/button 500 indicator is empty, which means that attachments are not used in either plan; this is confirmed by the visualization of the 3D models, which no longer show attachment objects.
When the MTP case is opened, in some embodiments all features present in any treatment plan selected for the MTP view are visualized on the 3D models by default, and the related tools are switched ON by default. These tools also have a colored indicator, meaning that these features are used in one or all of the treatment plans shown in the MTP view.
FIGS. 5C and 5D illustrate how a particular feature can be hidden or removed from the 3D model even though the feature is present in the treatment plan. This can be useful for simplifying the view of the 3D model when focusing on another feature. For example, in FIG. 5C, the attach button 500 is selected “ON” (indicated by the colored icon on the button) and the attach button indicator is filled which means that attachments 501 are present in the treatment plan and viewable on the 3D model since the attach button 500 is “ON”. As shown in FIG. 5D, to hide the attachments 501 from 3D models, the attach button 500 can be switched “OFF” by clicking the button to toggle the button from one state to another. Note that the attach button indicator is still filled and colored, which means that the attachment features are present in the treatment plans even if their visualization in the 3D models is switched OFF. Similarly, other features can be hidden as well by toggling the feature button to an OFF state.
FIGS. 5E-5G illustrate how filters can be used to select treatment plans using or not using various features, such as attachments and IPR. The use of a feature in a treatment plan can be quickly determined by looking at the feature indicators (a filled indicator means the feature is used and an empty indicator means the feature is not used). For example, in FIG. 5E, the attachment filter 508 is set to “No”, which causes the attach button 500 indicator to be empty, which means attachments are not being used in the treatment plans. In FIG. 5F, the attachment filter 508 is changed to “Yes”, which causes the attach button 500 indicator to be filled, meaning attachments are used in the treatment plan but are not shown in the model because the attach button 500 is deselected. In FIG. 5G, the attachment filter 508 is changed back to “No”, which causes the attach button 500 indicator to revert back to being empty as in FIG. 5E, meaning attachments are not used in the treatment plans.
In some embodiments, tool button states are not changed when searching/filtering for specific treatment plans. Therefore, if a tool button is in an “OFF” state, it will keep that “OFF” state for the new searched/filtered plans. Similarly, if the tool button is in an “ON” state, it will keep the “ON” state for the new searched/filtered plans. However, the indicators for those buttons will change between filled and empty to indicate whether the feature is present or absent from the new searched/filtered treatment plans.
In some embodiments, one or more feature buttons can have an indicator with more than 2 states, such as 3 states. For example, FIGS. 6A-6C illustrate the 3 states of the occlus button 600. In FIG. 6A, the occlus button 600 has an empty indicator, which means that there are no normal occlusal contacts or heavy inter-arch collisions. In FIG. 6B, the occlus button 600 has a green indicator, which means that the treatment plans shown on the screen have only normal occlusal contacts. In FIG. 6C, the occlus button 600 has a red indicator, which means that the treatment plans have heavy inter-arch collisions.
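The three-state occlus indicator above reduces to a simple priority rule; the state names and input format in this sketch are assumptions for illustration:

```python
def occlus_indicator(contact_labels):
    """Return the occlus button indicator state from per-contact labels.

    "red"   -> at least one heavy inter-arch collision is present,
    "green" -> only normal occlusal contacts are present,
    "empty" -> no occlusal contacts or collisions at all.
    """
    if "heavy" in contact_labels:
        return "red"
    if "normal" in contact_labels:
        return "green"
    return "empty"
```

Heavy collisions take priority over normal contacts, so a single heavy collision anywhere in the displayed plans turns the indicator red, matching FIG. 6C.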
As shown in FIGS. 7A and 7B, when the doctor wants to review details for one of the treatment plans shown in the MTP view and clicks on the view button 700, the selected treatment plan will be opened in the single treatment plan (STP) view. The toolbar button states (switched ON or switched OFF) will not be changed, but the indicators can change to show actual information about feature availability for this particular plan. For example, in the MTP view in FIG. 7A, only one of the treatment plans had attachments 703 used in the treatment. In the MTP view, the toolbar showed a green indicator for the attach button 702 because one of the plans has attachments 703. Upon switching to the STP view shown in FIG. 7B for the plan which doesn't have attachments used in the treatment, the attach button 702 won't change its state, but its indicator will become empty because the selected plan does not use this feature in the treatment.
Visualization of Overcorrection Stages
In some embodiments, as shown in FIGS. 8A and 8B, the doctor can visualize or hide overcorrection stages in the treatment using a tool/button placed next to or integrated into the end of the staging toolbar 800 that represents the treatment stages. In FIG. 8A, the overcorrection stages are hidden and a “+” button 802 can be clicked or selected to unhide and show the overcorrection stages in the staging toolbar 800. In FIG. 8B, the overcorrection stages 804 are visible in the staging toolbar 800 as blue lines (although other colors or patterns can be used to distinguish the overcorrection stages from the other stages). The doctor can review the details of the overcorrection stages 804 or the normal stages by selecting the stage using the staging toolbar 800. An “X” button 806 can be clicked or selected to hide the overcorrection stages.
In some embodiments, an overcorrection technique is used, for example, for a virtual c-chain with aligners, which simulates the effect of using elastic c-chains in bracket and wire treatments. In some embodiments, a treatment form can be displayed on the screen to the doctor, as shown in FIG. 9, as a prompt or reminder for asking about and/or using overcorrection stages. In some embodiments, overcorrection stages are used to provide additional (extra) forces for specific tooth movements. In some embodiments, in the overcorrection stages the teeth continue moving in the same direction as originally planned. Overcorrection stages may be requested for some particular tooth movements in order to achieve ideal treatment results.
Occlusal View and Occlusal Contacts Visualization
As shown in FIG. 10, the system can display on a screen an occlusal view of the subject's dentition in an open mouth configuration, where the upper arch 1000 is shown at the top of the screen and the lower arch 1002 is shown at the bottom of the screen. This view can be combined and overlaid with the inter-arch contacts 1004 visualization, shown in the figure as colored areas on the teeth (e.g., green for normal occlusal contacts and red for heavy inter-arch collisions). Using this view the doctor can clearly see teeth alignment for both arches in the initial teeth position, shown in FIG. 10 on the left, and in the final teeth position and in any stage of the treatment, shown in FIG. 10 in the middle and on the right. In addition to this information, the doctor also can visualize and investigate inter-arch contacts on both arches simultaneously.
Visualization of the occlusal contacts 1004 on the initial teeth position is very important and may be critical for doctors to verify that the bite has been set correctly. With this feature, doctors have a special tool to check that the initial bite setup is correct based on the pattern of occlusal contacts.
Occlusal contacts visualization on the final teeth position gives the doctor an understanding of whether the treatment will be efficacious and whether the subject will have or not have heavy inter-arch contact collisions after the treatment. Based on this information the system or doctor can decide whether the treatment requires modifications to fix such problems or whether the treatment is satisfactory and can be continued.
As shown in FIG. 10, in multiple treatment plans view the doctor is also able to visualize and compare under the occlusal views the inter-arch contacts for the initial malocclusion and for the final (or intermediate) teeth positions for two different treatment plans that are selected for comparison.
The combination of the occlusal view with the inter-arch contacts visualization gives the doctor the ability to check the teeth alignment of both arches and analyze inter-arch contacts information simultaneously.
Such views allow the doctor to check inter-arch contacts in the initial teeth position to check if the initial bite setup was done correctly or not.
Such views allow the doctor to check inter-arch contacts in the final teeth position in order to make sure that the treatment is proper and efficacious and does not have any major issues that would prohibit going forward with the treatment plan. Otherwise, if for example heavy inter-arch contacts are present, the doctor is able to modify the treatment plan to fix and eliminate such issues from the approved treatment plan.
In the multiple treatment plans view the doctor is able to view and compare the occlusal view and inter-arch contacts in initial teeth position and in the final teeth position for two different treatment plans that are selected for comparison.
In the single treatment plan view the doctor is also able to view inter-arch contacts on any stage of the treatment. This additional information gives the doctor a fuller understanding of what will happen with the inter-arch contacts during the course of the treatment and whether the treatment plan should be modified or corrected in the middle of the treatment or before beginning treatment.
When the case has heavy inter-arch collisions, the system can warn the doctor about the heavy inter-arch collisions. For example, a tool which is used to switch occlusal view with inter-arch contacts visualization can have a special red indicator, notifying the doctor about heavy collisions; or it can have a green indicator if the case has only normal occlusal contacts.
As shown in the multiple treatment plans views in FIGS. 11A and 11B, for switching to the special occlusal view with inter-arch contacts visualization the doctor can use an “OCCLUS” button 1100 on the feature toolbar 1102 that can be displayed on the top portion of the screen or along another screen edge. FIG. 11A shows a closed mouth view with the occlus button 1100 not selected. A filled green indicator on the occlus button indicates that there are normal occlusal contacts through the treatment and that no heavy inter-arch collisions are present during the course of the treatment. Clicking or otherwise selecting the occlus button 1100 changes the screen to the occlus view as shown in FIG. 11B with all 3D models changed to the occlusal view with the occlusal contacts visualization enabled. With the occlusal view, the doctor is able to compare teeth alignment for both arches for different treatment plans selected in MTP view and compare it with the initial teeth position shown on the initial malocclusion 3D model.
As shown in FIG. 12, the 3D models can be simultaneously rotated by selecting the rotate button 1200. Rotating the 3D models allows the doctor to view and analyze any side of the 3D model and check inter-arch contacts in detail if needed. Rotating one 3D model will rotate the others in the MTP view simultaneously, with all 3D models presented at the same angle and perspective. This allows the doctor to check differences in all the selected treatment plans and the initial malocclusion right away in one screen by comparing visualized information from any side on all 3D models shown in that view.
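The synchronized rotation described above amounts to applying one shared camera transform to every displayed model. A minimal sketch, with all names and the single-axis rotation chosen as simplifying assumptions (a real viewer would rotate about an arbitrary axis):

```python
import math

def rotate_z(point, angle_deg):
    """Rotate a 3D point about the z axis by the given angle in degrees."""
    a = math.radians(angle_deg)
    x, y, z = point
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a),
            z)

def rotate_all_views(model_points_by_view, angle_deg):
    """Apply the same rotation to every view at once, so all 3D models in
    the MTP screen are presented at the same angle and perspective."""
    return {view: [rotate_z(p, angle_deg) for p in pts]
            for view, pts in model_points_by_view.items()}
```

Because a single angle drives all views, dragging any one model leaves every plan (and the initial malocclusion) oriented identically, which is what makes side-by-side comparison from any angle possible.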
As shown in FIGS. 13A-13C, when the doctor is in a single treatment plan (STP) view (when only one treatment plan is available for review with additional details like staging), the treatment stages are viewable via the staging toolbar 1300 and the occlus button 1302 can be used to check/review/analyze inter-arch collisions/contacts 1304 on any stage of the treatment. FIG. 13A illustrates a STP view of the initial teeth position with the occlusal view switched on. FIG. 13B illustrates a STP view of the final teeth position with the occlusal view switched on. FIG. 13C illustrates a STP view of a middle stage of the treatment with the occlusal view switched on. However, in some embodiments, as shown in FIG. 13C, the occlusal contacts are not shown in any of the middle stages. In other embodiments, the occlusal contacts 1304 are also shown in the middle stages.
As shown in FIGS. 14A and 14B, by clicking or selecting the dual view button 1400 in the STP view the doctor also is able to switch to dual view where two 3D models are visualized simultaneously: the left one with the 3D model for initial malocclusion; the right one with the treatment plan and stage selected. In this view, as shown in FIG. 14A, with the final stage 1402 selected and occlusal view 1404 switched on, the doctor can compare teeth alignment before and after the treatment and see how inter-arch contacts are changed. In FIG. 14A, the heavy inter-arch collisions 1406 are shown in red and the normal occlusal contacts 1408 are shown in green. In the dual view mode the doctor is also able to switch the right 3D model to any stage of the treatment by selecting the desired stage using the stage toolbar 1401. When occlusal view is switched on in this case, the doctor can also compare details for the teeth alignment and inter-arch contacts for initial teeth position and any stage of the treatment. In FIG. 14B, the stage toolbar 1401 has been used to select stage 10 1403 with the occlusal view switched on. In some embodiments, as shown in FIG. 14B, the inter-arch contacts are not shown in the middle stages, while in other embodiments, the inter-arch contacts are shown for the middle stages.
Examples
FIGS. 1C and 1D illustrate examples of methods, e.g., methods of treatment plan review and/or modification as described herein and illustrated above. In FIG. 1C, the method 130 may include initially receiving or generating a 3D digital model of the patient's teeth in an initial (untreated) configuration, and one or more 3D digital models of the patient's teeth following a treatment plan (or plans). The digital models may be based on a digital scan of the patient's dentition (e.g., from an intraoral scan and/or a scan of a dental impression). The method may also include displaying a staging toolbar on a first portion of a display, wherein the staging toolbar comprises a plurality of digital buttons that correspond to a plurality of three dimensional (3D) digital models of a subject's dentition, wherein the plurality of 3D models includes a first model that shows an arrangement of a subject's teeth before receiving a treatment, one or more intermediate models that show an arrangement of the subject's teeth during a stage of the treatment, and a final model that shows an arrangement of the subject's teeth after receiving the treatment 132. Thereafter the method may include displaying a displayed 3D model corresponding to one of the plurality of 3D models 134, and changing the displayed 3D model to correspond to whichever digital button is selected by a user 136 (e.g., adjusting a view of the displayed 3D model shown on the display based on a user input, wherein the view of the displayed 3D model is applied to the changed displayed 3D model as the user selects the digital buttons). Optionally, the method may include displaying a plurality of buttons corresponding to treatment features, wherein the buttons visually indicate the presence of the treatment feature in the first orthodontic treatment and further wherein the buttons visually indicate that the treatment feature is actively being displayed on either or both the first 3D model and the second 3D model 138.
The method may also include receiving any modifications to the treatment plan from the user 139. These modifications may then be used to generate a new or modified treatment plan.
FIG. 1D illustrates another example of a method of treatment plan review and/or modification. In FIG. 1D, the method includes indicating occlusal contact severity. For example, as shown in FIG. 1D, the method includes displaying, on a display, a first three dimensional (3D) model of a subject's dentition that shows an arrangement of the subject's teeth before receiving a treatment, and a second 3D model that shows an arrangement of the subject's teeth subject to a first orthodontic treatment 142. As mentioned the method may also include retrieving, receiving or otherwise generating the 3D digital model(s). The method may further include displaying the second 3D model at either a final stage or an intermediate treatment stage of the first orthodontic treatment 144, and adjusting a view of both of the first 3D model and the second 3D model based on a user input modifying a view of one of the first 3D model or the second 3D model 146. The method may include receiving the collisions between the upper and lower arches in each of the untreated and one or more treated 3D models; optionally the method may include calculating collisions between upper and lower arches of the treated and/or untreated 3D models 148.
The one or more regions of inter-arch collisions may be indicated on either or both of the first 3D model and the second 3D model 152. This may be indicated in color, by label, etc., as described above.
Once the occlusal view is selected by the user, the method or system may then switch the view of both the first 3D model and the second 3D model to an occlusal view (e.g., when the user selects a control, wherein the occlusal view shows occlusal surfaces of teeth in the first 3D model and the second 3D model) 150. The system may then optionally receive, from the user, modifications to one or more treatment plans, such as further treatment instructions 152.
When a feature or element is herein referred to as being “on” another feature or element, it can be directly on the other feature or element or intervening features and/or elements may also be present. In contrast, when a feature or element is referred to as being “directly on” another feature or element, there are no intervening features or elements present. It will also be understood that, when a feature or element is referred to as being “connected”, “attached” or “coupled” to another feature or element, it can be directly connected, attached or coupled to the other feature or element or intervening features or elements may be present. In contrast, when a feature or element is referred to as being “directly connected”, “directly attached” or “directly coupled” to another feature or element, there are no intervening features or elements present. Although described or shown with respect to one embodiment, the features and elements so described or shown can apply to other embodiments. It will also be appreciated by those of skill in the art that references to a structure or feature that is disposed “adjacent” another feature may have portions that overlap or underlie the adjacent feature.
Terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. For example, as used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups thereof. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items and may be abbreviated as “/”.
Spatially relative terms, such as “under”, “below”, “lower”, “over”, “upper” and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if a device in the figures is inverted, elements described as “under” or “beneath” other elements or features would then be oriented “over” the other elements or features. Thus, the exemplary term “under” can encompass both an orientation of over and under. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly. Similarly, the terms “upwardly”, “downwardly”, “vertical”, “horizontal” and the like are used herein for the purpose of explanation only unless specifically indicated otherwise.
Although the terms “first” and “second” may be used herein to describe various features/elements (including steps), these features/elements should not be limited by these terms, unless the context indicates otherwise. These terms may be used to distinguish one feature/element from another feature/element. Thus, a first feature/element discussed below could be termed a second feature/element, and similarly, a second feature/element discussed below could be termed a first feature/element without departing from the teachings of the present invention.
Throughout this specification and the claims which follow, unless the context requires otherwise, the word “comprise”, and variations such as “comprises” and “comprising”, mean that various components can be co-jointly employed in the methods and articles (e.g., compositions and apparatuses including devices and methods). For example, the term “comprising” will be understood to imply the inclusion of any stated elements or steps but not the exclusion of any other elements or steps.
As used herein in the specification and claims, including as used in the examples and unless otherwise expressly specified, all numbers may be read as if prefaced by the word “about” or “approximately,” even if the term does not expressly appear. The phrase “about” or “approximately” may be used when describing magnitude and/or position to indicate that the value and/or position described is within a reasonable expected range of values and/or positions. For example, a numeric value may have a value that is +/−0.1% of the stated value (or range of values), +/−1% of the stated value (or range of values), +/−2% of the stated value (or range of values), +/−5% of the stated value (or range of values), +/−10% of the stated value (or range of values), etc. Any numerical values given herein should also be understood to include about or approximately that value, unless the context indicates otherwise. For example, if the value “10” is disclosed, then “about 10” is also disclosed. Any numerical range recited herein is intended to include all sub-ranges subsumed therein. It is also understood that when a value is disclosed, “less than or equal to” the value and “greater than or equal to” the value, as well as possible ranges between values, are also disclosed, as appropriately understood by the skilled artisan. For example, if the value “X” is disclosed, then “less than or equal to X” as well as “greater than or equal to X” (e.g., where X is a numerical value) is also disclosed. It is also understood that, throughout the application, data is provided in a number of different formats, and that this data represents endpoints and starting points, and ranges, for any combination of the data points. For example, if a particular data point “10” and a particular data point “15” are disclosed, it is understood that greater than, greater than or equal to, less than, less than or equal to, and equal to 10 and 15 are considered disclosed, as well as between 10 and 15.
It is also understood that each unit between two particular units are also disclosed. For example, if 10 and 15 are disclosed, then 11, 12, 13, and 14 are also disclosed.
Although various illustrative embodiments are described above, any of a number of changes may be made to various embodiments without departing from the scope of the invention as described by the claims. For example, the order in which various described method steps are performed may often be changed in alternative embodiments, and in other alternative embodiments one or more method steps may be skipped altogether. Optional features of various device and system embodiments may be included in some embodiments and not in others. Therefore, the foregoing description is provided primarily for exemplary purposes and should not be interpreted to limit the scope of the invention as it is set forth in the claims.
The examples and illustrations included herein show, by way of illustration and not of limitation, specific embodiments in which the subject matter may be practiced. As mentioned, other embodiments may be utilized and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. Such embodiments of the inventive subject matter may be referred to herein individually or collectively by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept, if more than one is, in fact, disclosed. Thus, although specific embodiments have been illustrated and described herein, any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the above description.

Claims (22)

What is claimed is:
1. A method, the method comprising:
receiving a first orthodontic treatment plan comprising a plurality of stages, where each stage comprises a different arrangement of a subject's teeth;
selecting a stage from the first orthodontic treatment plan to be displayed;
displaying, on a display, a first three dimensional (3D) model of a subject's dentition that shows an arrangement of the subject's teeth before receiving a treatment, adjacent to a second 3D model that shows an arrangement of the subject's teeth in the selected stage, subject to the first orthodontic treatment;
wherein the second 3D model is displayed at either a final stage or an intermediate treatment stage of the first orthodontic treatment;
adjusting, simultaneously, a view of both of the first 3D model and the second 3D model based on a user input modifying a view of one of the first 3D model or the second 3D model;
switching the view of both the first 3D model and the second 3D model to an occlusal view when the user selects a control, wherein the occlusal view shows occlusal surfaces of teeth in the first 3D model and the second 3D model, further wherein the occlusal view indicates one or more regions of inter-arch collisions on either or both of the first 3D model and the second 3D model.
2. The method of claim 1, wherein the occlusal view indicates one or more regions of inter-arch collisions using an indicator that is scaled to differentiate a relative degree of contact between an upper arch and a lower arch.
3. The method of claim 2, wherein the indicator is colored differently to differentiate regions of normal contact from regions of high contact.
4. The method of claim 1, further comprising calculating regions of inter-arch collision on the first 3D model and calculating regions of inter-arch collision on the second 3D model.
5. The method of claim 1, further comprising displaying a staging toolbar on a portion of a display, wherein the staging toolbar comprises a plurality of digital buttons that correspond to a plurality of treatment stages of the first orthodontic treatment; and changing the displayed stage of the second 3D model to correspond to a stage selected by the user from the staging toolbar.
6. The method of claim 1, wherein displaying comprises displaying side-by-side.
7. The method of claim 1, wherein displaying comprises displaying an upper arch engaged with a lower arch of the first 3D model and displaying an upper arch engaged with a lower arch of the second 3D model.
8. The method of claim 1, further comprising displaying a plurality of buttons corresponding to treatment features, wherein the buttons visually indicate a presence of a treatment feature in the first orthodontic treatment and further wherein the buttons visually indicate that that treatment feature is actively being displayed on either or both the first 3D model and the second 3D model.
9. The method of claim 8, wherein the treatment features comprise: tooth numbering, attachments, interproximal reduction spacing, and pontics.
10. The method of claim 1, wherein adjusting the view of both of the first 3D model and the second 3D model based on the user input comprises modifying one or more of: a rotation, a zoom, or a pan of the first 3D model and the second 3D model.
11. The method of claim 1, wherein the user input includes selecting from a set of preset views.
12. A method, the method comprising:
receiving a first orthodontic treatment plan comprising a first plurality of stages and a second orthodontic treatment plan comprising a second plurality of stages, where each stage comprises a different arrangement of a subject's teeth in the first or second orthodontic treatment plan;
selecting a stage from the first orthodontic treatment plan to be displayed and a stage from the second orthodontic treatment plan to be displayed;
displaying, on a display, a first three dimensional (3D) model of a subject's dentition that shows an arrangement of the subject's teeth before receiving a treatment, adjacent to a second 3D model that shows an arrangement of the subject's teeth in the selected stage from the first orthodontic treatment plan, and adjacent to a third 3D model that shows an arrangement of the subject's teeth in the selected stage from the second orthodontic treatment plan;
wherein the second 3D model and third 3D model are displayed at either a final stage or an intermediate treatment stage of the first and second orthodontic treatment plans;
adjusting a view of the first 3D model, the second 3D model, and the third 3D model based on a user input modifying a view of one of the first, second or third 3D models wherein adjusting the view comprises adjusting one or more of a rotation, a zoom and a pan;
switching the view of each of the first, second and third 3D models to an occlusal view when the user selects a control, wherein the occlusal view shows occlusal surfaces of teeth in the first, second and third 3D models, further wherein the occlusal view indicates one or more regions of inter-arch collisions on each of the first, second and third 3D models, using an indicator that is scaled to differentiate a relative degree of contact between an upper arch and a lower arch.
13. The method of claim 12, wherein the indicator is colored differently to differentiate regions of normal contact from regions of high contact.
14. The method of claim 12, further comprising calculating regions of inter-arch collision on the first 3D model and calculating regions of inter-arch collision on the second 3D model.
15. The method of claim 12, further comprising displaying a staging toolbar on a portion of a display, wherein the staging toolbar comprises a plurality of digital buttons that correspond to a plurality of treatment stages of the first orthodontic treatment; and changing the displayed stage of the second 3D model to correspond to a stage selected by the user from the staging toolbar.
16. The method of claim 12, wherein displaying comprises displaying side-by-side.
17. The method of claim 12, wherein displaying comprises displaying the upper arch engaged with the lower arch of the first 3D model and displaying the upper arch engaged with the lower arch of the second 3D model.
18. The method of claim 12, further comprising displaying a plurality of buttons corresponding to treatment features, wherein the buttons visually indicate a presence of a treatment feature in the first orthodontic treatment and further wherein the buttons visually indicate that that treatment feature is actively being displayed on either or both the first 3D model and the second 3D model.
19. The method of claim 18, wherein the treatment features comprise: tooth numbering, attachments, interproximal reduction spacing, and pontics.
20. The method of claim 12, wherein adjusting the view of both of the first 3D model and the second 3D model based on the user input comprises modifying one or more of: a rotation, a zoom, or a pan of the first 3D model and the second 3D model.
21. The method of claim 12, wherein the user input includes selecting from a set of preset views.
22. A system for visualizing a subject's teeth, the system comprising:
one or more processors;
a memory coupled to the one or more processors, the memory configured to store computer-program instructions, that, when executed by the one or more processors, perform a computer-implemented method comprising:
receiving a first orthodontic treatment plan comprising a plurality of stages, where each stage comprises a different arrangement of the subject's teeth;
selecting a stage from the first orthodontic treatment plan to be displayed;
displaying, on a display, a first three dimensional (3D) model of a subject's dentition that shows an arrangement of the subject's teeth before receiving a treatment, adjacent to a second 3D model that shows an arrangement of the subject's teeth in the selected stage, subject to the first orthodontic treatment;
wherein the second 3D model is displayed at either a final stage or an intermediate treatment stage of the first orthodontic treatment;
adjusting, simultaneously, a view of both of the first 3D model and the second 3D model based on a user input modifying a view of one of the first 3D model or the second 3D model;
switching the view of both the first 3D model and the second 3D model to an occlusal view when the user selects a control, wherein the occlusal view shows occlusal surfaces of teeth in the first 3D model and the second 3D model, further wherein the occlusal view indicates one or more regions of inter-arch collisions on either or both of the first 3D model and the second 3D model.
US16/457,754 2018-06-29 2019-06-28 Digital treatment planning by modeling inter-arch collisions Active US10996813B2 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US16/457,754 US10996813B2 (en) 2018-06-29 2019-06-28 Digital treatment planning by modeling inter-arch collisions
US17/246,547 US11449191B2 (en) 2018-06-29 2021-04-30 Digital treatment planning by modeling inter-arch collisions
US17/945,957 US11809214B2 (en) 2018-06-29 2022-09-15 Systems for visualizing teeth and treatment planning
US18/472,209 US12067210B2 (en) 2018-06-29 2023-09-21 Methods and systems for visualizing teeth and treatment planning
US18/770,614 US20240361879A1 (en) 2018-06-29 2024-07-11 Systems for visualizing teeth and treatment planning

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862692538P 2018-06-29 2018-06-29
US16/457,754 US10996813B2 (en) 2018-06-29 2019-06-28 Digital treatment planning by modeling inter-arch collisions

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/246,547 Continuation US11449191B2 (en) 2018-06-29 2021-04-30 Digital treatment planning by modeling inter-arch collisions

Publications (2)

Publication Number Publication Date
US20200004402A1 US20200004402A1 (en) 2020-01-02
US10996813B2 true US10996813B2 (en) 2021-05-04

Family

ID=69007571

Family Applications (5)

Application Number Title Priority Date Filing Date
US16/457,754 Active US10996813B2 (en) 2018-06-29 2019-06-28 Digital treatment planning by modeling inter-arch collisions
US17/246,547 Active US11449191B2 (en) 2018-06-29 2021-04-30 Digital treatment planning by modeling inter-arch collisions
US17/945,957 Active US11809214B2 (en) 2018-06-29 2022-09-15 Systems for visualizing teeth and treatment planning
US18/472,209 Active US12067210B2 (en) 2018-06-29 2023-09-21 Methods and systems for visualizing teeth and treatment planning
US18/770,614 Pending US20240361879A1 (en) 2018-06-29 2024-07-11 Systems for visualizing teeth and treatment planning

Family Applications After (4)

Application Number Title Priority Date Filing Date
US17/246,547 Active US11449191B2 (en) 2018-06-29 2021-04-30 Digital treatment planning by modeling inter-arch collisions
US17/945,957 Active US11809214B2 (en) 2018-06-29 2022-09-15 Systems for visualizing teeth and treatment planning
US18/472,209 Active US12067210B2 (en) 2018-06-29 2023-09-21 Methods and systems for visualizing teeth and treatment planning
US18/770,614 Pending US20240361879A1 (en) 2018-06-29 2024-07-11 Systems for visualizing teeth and treatment planning

Country Status (1)

Country Link
US (5) US10996813B2 (en)

Cited By (68)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11151753B2 (en) 2018-09-28 2021-10-19 Align Technology, Inc. Generic framework for blurring of colors for teeth in generated images using height map
US20210375031A1 (en) * 2019-02-15 2021-12-02 Medit Corp. Method for replaying scanning process
US11232867B2 (en) 2008-05-23 2022-01-25 Align Technology, Inc. Smile designer
US11232573B2 (en) 2019-09-05 2022-01-25 Align Technology, Inc. Artificially intelligent systems to manage virtual dental models using dental images
US20220114785A1 (en) * 2019-07-09 2022-04-14 Panasonic Intellectual Property Management Co., Ltd. Three-dimensional model generation method and three-dimensional model generation device
US11357598B2 (en) 2019-04-03 2022-06-14 Align Technology, Inc. Dental arch analysis and tooth numbering
US11376100B2 (en) 2009-08-21 2022-07-05 Align Technology, Inc. Digital dental modeling
US20220218438A1 (en) * 2021-01-14 2022-07-14 Orthosnap Corp. Creating three-dimensional (3d) animation
US11395717B2 (en) 2018-06-29 2022-07-26 Align Technology, Inc. Visualization of clinical orthodontic assets and occlusion contact shape
US11452577B2 (en) 2018-07-20 2022-09-27 Align Technology, Inc. Generation of synthetic post treatment images of teeth
US11464604B2 (en) 2018-06-29 2022-10-11 Align Technology, Inc. Dental arch width measurement tool
US11534272B2 (en) 2018-09-14 2022-12-27 Align Technology, Inc. Machine learning scoring system and methods for tooth position assessment
US11654001B2 (en) 2018-10-04 2023-05-23 Align Technology, Inc. Molar trimming prediction and validation using machine learning
US11666416B2 (en) 2018-06-29 2023-06-06 Align Technology, Inc. Methods for simulating orthodontic treatment
US11672629B2 (en) 2018-05-21 2023-06-13 Align Technology, Inc. Photo realistic rendering of smile image after treatment
US11678956B2 (en) 2012-11-19 2023-06-20 Align Technology, Inc. Filling undercut areas of teeth relative to axes of appliance placement
US11678954B2 (en) 2012-05-22 2023-06-20 Align Technology, Inc. Adjustment of tooth position in a virtual dental model
US11707344B2 (en) 2019-03-29 2023-07-25 Align Technology, Inc. Segmentation quality assessment
US11717381B2 (en) 2006-08-30 2023-08-08 Align Technology, Inc. Methods for tooth collision detection and avoidance in orthodontic treament
US11723749B2 (en) 2015-08-20 2023-08-15 Align Technology, Inc. Photograph-based assessment of dental treatments and procedures
US11737852B2 (en) 2008-03-25 2023-08-29 Align Technology, Inc. Computer-implemented method of smoothing a shape of a tooth model
US11751974B2 (en) 2018-05-08 2023-09-12 Align Technology, Inc. Automatic ectopic teeth detection on scan
US11759291B2 (en) 2018-05-22 2023-09-19 Align Technology, Inc. Tooth segmentation based on anatomical edge information
US11766311B2 (en) 2007-06-08 2023-09-26 Align Technology, Inc. Treatment progress tracking and recalibration
US11771526B2 (en) 2019-01-03 2023-10-03 Align Technology, Inc. Systems and methods for nonlinear tooth modeling
US11790643B2 (en) 2017-11-07 2023-10-17 Align Technology, Inc. Deep learning for tooth detection and evaluation
US11800216B2 (en) 2020-07-23 2023-10-24 Align Technology, Inc. Image based orthodontic treatment refinement
US11801121B2 (en) 2018-06-29 2023-10-31 Align Technology, Inc. Methods for generating composite images of a patient
US11805991B2 (en) 2017-02-13 2023-11-07 Align Technology, Inc. Cheek retractor and mobile device holder
US11819375B2 (en) 2016-11-04 2023-11-21 Align Technology, Inc. Methods and apparatuses for dental images
US11819377B2 (en) 2007-06-08 2023-11-21 Align Technology, Inc. Generating 3D models of a patient's teeth based on 2D teeth images
US11842437B2 (en) 2018-09-19 2023-12-12 Align Technology, Inc. Marker-less augmented reality system for mammoplasty pre-visualization
US11864969B2 (en) 2011-05-13 2024-01-09 Align Technology, Inc. Prioritization of three dimensional dental elements
US11864971B2 (en) 2017-03-20 2024-01-09 Align Technology, Inc. Generating a virtual patient depiction of an orthodontic treatment
US11864970B2 (en) 2020-11-06 2024-01-09 Align Technology, Inc. Accurate method to determine center of resistance for 1D/2D/3D problems
US11872102B2 (en) 2017-01-24 2024-01-16 Align Technology, Inc. Updating an orthodontic treatment plan during treatment
US11883255B2 (en) 2008-12-30 2024-01-30 Align Technology, Inc. Method and system for dental visualization
US11903793B2 (en) 2019-12-31 2024-02-20 Align Technology, Inc. Machine learning dental segmentation methods using sparse voxel representations
US11957531B2 (en) 2017-12-15 2024-04-16 Align Technology, Inc. Orthodontic systems for monitoring treatment
US11957532B2 (en) 2012-12-19 2024-04-16 Align Technology, Inc. Creating a digital dental model of a patient's teeth using interproximal information
US11957536B2 (en) 2017-01-31 2024-04-16 Swift Health Systems Inc. Hybrid orthodontic archwires
US11986369B2 (en) 2012-03-01 2024-05-21 Align Technology, Inc. Methods and systems for determining a dental treatment difficulty in digital treatment planning
US11992382B2 (en) 2017-10-05 2024-05-28 Align Technology, Inc. Virtual fillers for virtual models of dental arches
US11998410B2 (en) 2017-07-27 2024-06-04 Align Technology, Inc. Tooth shading, transparency and glazing
US12042354B2 (en) 2019-03-01 2024-07-23 Swift Health Systems Inc. Indirect bonding trays with bite turbo and orthodontic auxiliary integration
US12048605B2 (en) 2020-02-11 2024-07-30 Align Technology, Inc. Tracking orthodontic treatment using teeth images
US12048606B2 (en) 2015-02-23 2024-07-30 Align Technology, Inc. Systems for treatment planning with overcorrection
US12053345B2 (en) 2021-09-03 2024-08-06 Swift Health Systems Inc. Method of administering adhesive to bond orthodontic brackets
US12053346B2 (en) 2019-10-31 2024-08-06 Swift Health Systems Inc. Indirect orthodontic bonding systems and methods
US12064310B2 (en) 2017-08-17 2024-08-20 Align Technology, Inc. Systems, methods, and apparatus for correcting malocclusions of teeth
US12064311B2 (en) 2019-05-14 2024-08-20 Align Technology, Inc. Visual presentation of gingival line generated based on 3D tooth model
US12076207B2 (en) 2020-02-05 2024-09-03 Align Technology, Inc. Systems and methods for precision wing placement
US12086964B2 (en) 2019-12-04 2024-09-10 Align Technology, Inc. Selective image modification based on sharpness metric and image domain
US12090025B2 (en) 2020-06-11 2024-09-17 Swift Health Systems Inc. Orthodontic appliance with non-sliding archform
USD1043994S1 (en) 2022-01-06 2024-09-24 Swift Health Systems Inc. Archwire
US12106845B2 (en) 2019-11-05 2024-10-01 Align Technology, Inc. Clinically relevant anonymization of photos and video
US12109089B2 (en) 2010-04-30 2024-10-08 Align Technology, Inc. Individualized orthodontic treatment index
US12125581B2 (en) 2020-02-20 2024-10-22 Align Technology, Inc. Medical imaging data compression and extraction on client side
US12181857B2 (en) 2011-07-29 2024-12-31 Align Technology, Inc. Systems and methods for tracking teeth movement during orthodontic treatment
US12193905B2 (en) 2019-03-25 2025-01-14 Align Technology, Inc. Prediction of multiple treatment settings
US12193908B2 (en) 2021-09-03 2025-01-14 Swift Health Systems, Inc. Orthodontic appliance with non-sliding archform
US12213855B2 (en) 2017-11-01 2025-02-04 Align Technology, Inc. Methods of manufacturing dental aligners
US12220288B2 (en) 2021-10-27 2025-02-11 Align Technology, Inc. Systems and methods for orthodontic and restorative treatment planning
US12220294B2 (en) 2017-04-21 2025-02-11 Swift Health Systems Inc. Indirect bonding trays, non-sliding orthodontic appliances, and registration systems for use thereof
USD1063077S1 (en) 2021-04-26 2025-02-18 Align Technology, Inc. Dental imaging attachment for a smartphone
US12232924B2 (en) 2017-08-15 2025-02-25 Align Technology, Inc. Buccal corridor assessment and computation
US12268571B2 (en) 2021-03-12 2025-04-08 Swift Health Systems Inc. Indirect orthodontic bonding systems and methods
US12279925B2 (en) 2012-10-30 2025-04-22 University Of Southern California Orthodontic appliance with snap fitted, non- sliding archwire

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10996813B2 (en) * 2018-06-29 2021-05-04 Align Technology, Inc. Digital treatment planning by modeling inter-arch collisions
US20220257341A1 (en) * 2019-07-18 2022-08-18 3M Innovative Properties Company Virtual articulation in orthodontic and dental treatment planning
US20230218371A1 (en) * 2020-06-03 2023-07-13 3M Innovative Properties Company Display of multiple automated orthodontic treatment options
CN112116721B (en) * 2020-09-21 2023-11-28 雅客智慧(北京)科技有限公司 Three-dimensional model marking method, device, electronic equipment and storage medium
KR102413697B1 (en) * 2020-12-18 2022-06-28 오스템임플란트 주식회사 Digital tooth set up method and apparatus using graphic user interface for tooth set up
EP4346690A1 (en) * 2021-06-01 2024-04-10 Align Technology, Inc. Automated management of clinical modifications to treatment plans using three-dimensional controls
EP4452125A1 (en) * 2021-12-23 2024-10-30 Hirsch Dynamics Holding AG A system for visualizing at least one three-dimensional virtual model of at least part of a dentition
EP4483375A1 (en) * 2022-02-25 2025-01-01 Solventum Intellectual Properties Company Systems and methods for visualization of oral care treatment timeline

Citations (139)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5975893A (en) * 1997-06-20 1999-11-02 Align Technology, Inc. Method and system for incrementally moving teeth
US6227850B1 (en) 1999-05-13 2001-05-08 Align Technology, Inc. Teeth viewing system
US6227851B1 (en) 1998-12-04 2001-05-08 Align Technology, Inc. Manipulable dental model system for fabrication of a dental appliance
US6299440B1 (en) 1999-01-15 2001-10-09 Align Technology, Inc System and method for producing tooth movement
US6318994B1 (en) 1999-05-13 2001-11-20 Align Technology, Inc Tooth path treatment plan
US6371761B1 (en) 2000-03-30 2002-04-16 Align Technology, Inc. Flexible plane for separating teeth models
US6386878B1 (en) 2000-08-16 2002-05-14 Align Technology, Inc. Systems and methods for removing gingiva from teeth
US6406292B1 (en) 1999-05-13 2002-06-18 Align Technology, Inc. System for determining final position of teeth
US6471512B1 (en) * 1999-11-30 2002-10-29 Ora Metrix, Inc. Method and apparatus for determining and monitoring orthodontic treatment
US6488499B1 (en) 2000-04-25 2002-12-03 Align Technology, Inc. Methods for correcting deviations in preplanned tooth rearrangements
US20030009252A1 (en) * 2000-02-17 2003-01-09 Align Technology, Inc. Efficient data representation of teeth model
US6514074B1 (en) 1999-05-14 2003-02-04 Align Technology, Inc. Digitally modeling the deformation of gingival
US6582229B1 (en) 2000-04-25 2003-06-24 Align Technology, Inc. Methods for modeling bite registration
US20030143509A1 (en) 2002-01-29 2003-07-31 Cadent, Ltd. Method and system for assisting in applying an orthodontic treatment
US6621491B1 (en) 2000-04-27 2003-09-16 Align Technology, Inc. Systems and methods for integrating 3D diagnostic data
US20030207227A1 (en) 2002-05-02 2003-11-06 Align Technology, Inc. Systems and methods for treating patients
US6726478B1 (en) 2000-10-30 2004-04-27 Align Technology, Inc. Systems and methods for bite-setting teeth models
US6767208B2 (en) 2002-01-10 2004-07-27 Align Technology, Inc. System and method for positioning teeth
US20040152036A1 (en) 2002-09-10 2004-08-05 Amir Abolfathi Architecture for treating teeth
US6783360B2 (en) 2000-12-13 2004-08-31 Align Technology, Inc. Systems and methods for positioning teeth
US20040259049A1 (en) 2003-06-17 2004-12-23 Avi Kopelman Method and system for selecting orthodontic appliances
US20050182654A1 (en) 2004-02-14 2005-08-18 Align Technology, Inc. Systems and methods for providing treatment planning
US20050244791A1 (en) 2004-04-29 2005-11-03 Align Technology, Inc. Interproximal reduction treatment planning
US20060127852A1 (en) 2004-12-14 2006-06-15 Huafeng Wen Image based orthodontic treatment viewing system
US20060127836A1 (en) 2004-12-14 2006-06-15 Huafeng Wen Tooth movement tracking system
US20060127854A1 (en) 2004-12-14 2006-06-15 Huafeng Wen Image based dentition record digitization
US7074039B2 (en) 2002-05-02 2006-07-11 Cadent Ltd. Method and system for assessing the outcome of an orthodontic treatment
US7074038B1 (en) 2000-12-29 2006-07-11 Align Technology, Inc. Methods and systems for treating teeth
US7077647B2 (en) 2002-08-22 2006-07-18 Align Technology, Inc. Systems and methods for treatment analysis by teeth matching
US20060275731A1 (en) 2005-04-29 2006-12-07 Orthoclear Holdings, Inc. Treatment of teeth by aligners
US20060275736A1 (en) 2005-04-22 2006-12-07 Orthoclear Holdings, Inc. Computer aided orthodontic treatment planning
US20070238065A1 (en) * 2004-02-27 2007-10-11 Align Technology, Inc. Method and System for Providing Dynamic Orthodontic Assessment and Treatment Profiles
US7293988B2 (en) 2004-12-14 2007-11-13 Align Technology, Inc. Accurately predicting and preventing interference between tooth models
US7309230B2 (en) 2004-12-14 2007-12-18 Align Technology, Inc. Preventing interference between tooth models
US7357634B2 (en) 2004-11-05 2008-04-15 Align Technology, Inc. Systems and methods for substituting virtual dental appliances
US20080306724A1 (en) 2007-06-08 2008-12-11 Align Technology, Inc. Treatment planning and progress tracking systems and methods
US20090098502A1 (en) * 2006-02-28 2009-04-16 Ormco Corporation Software and Methods for Dental Treatment Planning
US7555403B2 (en) 2005-07-15 2009-06-30 Cadent Ltd. Method for manipulating a dental virtual model, method for creating physical entities based on a dental virtual model thus manipulated, and dental models thus created
US7637740B2 (en) 2004-02-27 2009-12-29 Align Technology, Inc. Systems and methods for temporally staging teeth
US20100009308A1 (en) 2006-05-05 2010-01-14 Align Technology, Inc. Visualizing and Manipulating Digital Models for Dental Treatment
US20100068676A1 (en) 2008-09-16 2010-03-18 David Mason Dental condition evaluation and treatment
US20100068672A1 (en) 2008-09-16 2010-03-18 Hossein Arjomand Orthodontic condition evaluation
US7689398B2 (en) 2006-08-30 2010-03-30 Align Technology, Inc. System and method for modeling and application of interproximal reduction of teeth
US20100092907A1 (en) 2008-10-10 2010-04-15 Align Technology, Inc. Method And System For Deriving A Common Coordinate System For Virtual Orthodontic Brackets
US20100129762A1 (en) * 2008-11-24 2010-05-27 Align Technology, Inc. Dental appliance with simulated teeth and method for making
US7746339B2 (en) 2006-07-14 2010-06-29 Align Technology, Inc. System and method for automatic detection of dental features
US20100167243A1 (en) 2008-12-31 2010-07-01 Anton Spiridonov System and method for automatic construction of realistic looking tooth roots
US7844356B2 (en) 2006-07-19 2010-11-30 Align Technology, Inc. System and method for automatic construction of orthodontic reference objects
US7844429B2 (en) 2006-07-19 2010-11-30 Align Technology, Inc. System and method for three-dimensional complete tooth modeling
US7865259B2 (en) 2007-12-06 2011-01-04 Align Technology, Inc. System and method for improved dental geometry representation
US7878804B2 (en) 2007-02-28 2011-02-01 Align Technology, Inc. Tracking teeth movement correction
US7880751B2 (en) 2004-02-27 2011-02-01 Align Technology, Inc. Method and system for providing dynamic orthodontic assessment and treatment profiles
US7904308B2 (en) 2006-04-18 2011-03-08 Align Technology, Inc. Method and system for providing indexing and cataloguing of orthodontic related treatment profiles and options
US7930189B2 (en) 2004-02-27 2011-04-19 Align Technology, Inc. Method and system for providing dynamic orthodontic assessment and treatment profiles
US7942672B2 (en) 2008-02-15 2011-05-17 Align Technology, Inc. Gingiva modeling
US7970628B2 (en) 2004-02-27 2011-06-28 Align Technology, Inc. Method and system for providing dynamic orthodontic assessment and treatment profiles
US7970627B2 (en) 2004-02-27 2011-06-28 Align Technology, Inc. Method and system for providing dynamic orthodontic assessment and treatment profiles
US8038444B2 (en) 2006-08-30 2011-10-18 Align Technology, Inc. Automated treatment staging for teeth
US8044954B2 (en) 2006-09-22 2011-10-25 Align Technology, Inc. System and method for automatic construction of tooth axes
US8075306B2 (en) 2007-06-08 2011-12-13 Align Technology, Inc. System and method for detecting deviations during the course of an orthodontic treatment to gradually reposition teeth
US8092215B2 (en) 2008-05-23 2012-01-10 Align Technology, Inc. Smile designer
US8108189B2 (en) 2008-03-25 2012-01-31 Align Technologies, Inc. Reconstruction of non-visible part of tooth
US8126726B2 (en) 2004-02-27 2012-02-28 Align Technology, Inc. System and method for facilitating automated dental measurements and diagnostics
US8260591B2 (en) 2004-04-29 2012-09-04 Align Technology, Inc. Dynamically specifying a view
US8275180B2 (en) 2007-08-02 2012-09-25 Align Technology, Inc. Mapping abnormal dental references
US8401826B2 (en) 2006-12-22 2013-03-19 Align Technology, Inc. System and method for representation, modeling and application of three-dimensional digital pontics
US8439672B2 (en) 2008-01-29 2013-05-14 Align Technology, Inc. Method and system for optimizing dental aligner geometry
US20130204599A1 (en) 2012-02-02 2013-08-08 Align Technology, Inc. Virtually testing force placed on a tooth
US8562338B2 (en) 2007-06-08 2013-10-22 Align Technology, Inc. Treatment progress tracking and recalibration
US8591225B2 (en) 2008-12-12 2013-11-26 Align Technology, Inc. Tooth movement measurement by automatic impression matching
US8788285B2 (en) 2007-08-02 2014-07-22 Align Technology, Inc. Clinical data file
US8843381B2 (en) 2006-04-18 2014-09-23 Align Technology, Inc. Automated method and system for case matching assessment based on geometrical evaluation of stages in treatment plan
US20140294273A1 (en) * 2011-08-31 2014-10-02 Maxime Jaisson Method for designing an orthodontic appliance
US8874452B2 (en) * 2004-02-27 2014-10-28 Align Technology, Inc. Method and system for providing dynamic orthodontic assessment and treatment profiles
US8896592B2 (en) 2009-08-21 2014-11-25 Align Technology, Inc. Digital dental modeling
US20150025907A1 (en) * 2000-04-25 2015-01-22 Align Technology, Inc. Treatment analysis systems and methods
US9037439B2 (en) 2011-05-13 2015-05-19 Align Technology, Inc. Prioritization of three dimensional dental elements
US9060829B2 (en) 2007-06-08 2015-06-23 Align Technology, Inc. Systems and method for management and delivery of orthodontic treatment
US9125709B2 (en) 2011-07-29 2015-09-08 Align Technology, Inc. Systems and methods for tracking teeth movement during orthodontic treatment
US9211166B2 (en) 2010-04-30 2015-12-15 Align Technology, Inc. Individualized orthodontic treatment index
US9220580B2 (en) 2012-03-01 2015-12-29 Align Technology, Inc. Determining a dental treatment difficulty
US20160135925A1 (en) * 2014-11-13 2016-05-19 Align Technology, Inc. Method for tracking, predicting, and proactively correcting malocclusion and related issues
US9364296B2 (en) 2012-11-19 2016-06-14 Align Technology, Inc. Filling undercut areas of teeth relative to axes of appliance placement
US9375300B2 (en) 2012-02-02 2016-06-28 Align Technology, Inc. Identifying forces on a tooth
US9414897B2 (en) 2012-05-22 2016-08-16 Align Technology, Inc. Adjustment of tooth position in a virtual dental model
US20160242870A1 (en) 2015-02-23 2016-08-25 Align Technology, Inc. Method to manufacture aligner by modifying tooth position
US20160287354A1 (en) * 2015-04-06 2016-10-06 Smarter Alloys Inc. Systems and methods for orthodontic archwires for malocclusions
US20160310235A1 (en) 2015-04-24 2016-10-27 Align Technology, Inc. Comparative orthodontic treatment planning tool
US9642678B2 (en) * 2008-12-30 2017-05-09 Align Technology, Inc. Method and system for dental visualization
US9675428B2 (en) * 2013-07-12 2017-06-13 Carestream Health, Inc. Video-based auto-capture for dental surface imaging apparatus
US20180168775A1 (en) 2016-12-20 2018-06-21 Align Technology, Inc. Matching assets in 3d treatment plans
US20180263731A1 (en) 2017-03-20 2018-09-20 Align Technology, Inc. Generating a virtual depiction of an orthodontic treatment of a patient
US20180280118A1 (en) 2017-03-27 2018-10-04 Align Technology, Inc. Apparatuses and methods assisting in dental therapies
US10143536B2 (en) * 2012-10-31 2018-12-04 Ormco Corporation Computational device for an orthodontic appliance for generating an aesthetic smile
US20190029784A1 (en) 2017-07-27 2019-01-31 Align Technology, Inc. Tooth shading, transparency and glazing
US20190053876A1 (en) 2017-08-17 2019-02-21 Align Technology, Inc. Systems, methods, and apparatus for correcting malocclusions of teeth
US20190076214A1 (en) 2017-08-15 2019-03-14 Align Technology, Inc. Buccal corridor assessment and computation
US10248883B2 (en) 2015-08-20 2019-04-02 Align Technology, Inc. Photograph-based assessment of dental treatments and procedures
US20190105127A1 (en) 2017-10-05 2019-04-11 Align Technology, Inc. Virtual fillers
US20190180443A1 (en) 2017-11-07 2019-06-13 Align Technology, Inc. Deep learning for tooth detection and evaluation
US20190175303A1 (en) * 2017-11-01 2019-06-13 Align Technology, Inc. Automatic treatment planning
US20190192259A1 (en) 2017-12-15 2019-06-27 Align Technology, Inc. Closed loop adaptive orthodontic treatment methods and apparatuses
US10368719B2 (en) * 2013-03-14 2019-08-06 Ormco Corporation Registering shape data extracted from intra-oral imagery to digital reconstruction of teeth for determining position and orientation of roots
US10390913B2 (en) * 2018-01-26 2019-08-27 Align Technology, Inc. Diagnostic intraoral scanning
US20190328487A1 (en) 2018-04-30 2019-10-31 Align Technology, Inc. Systems and methods for treatment using domain-specific treatment protocols
US10463452B2 (en) 2016-08-24 2019-11-05 Align Technology, Inc. Method to visualize and manufacture aligner by modifying tooth position
US20190343601A1 (en) 2018-05-08 2019-11-14 Align Technology, Inc. Automatic ectopic teeth detection on scan
US20190350680A1 (en) 2018-05-21 2019-11-21 Align Technology, Inc. Photo realistic rendering of smile image after treatment
US20190357997A1 (en) 2018-05-22 2019-11-28 Align Technology, Inc. Tooth segmentation based on anatomical edge information
US10509838B2 (en) * 2016-07-27 2019-12-17 Align Technology, Inc. Methods and apparatuses for forming a three-dimensional volumetric model of a subject's teeth
US10507087B2 (en) * 2016-07-27 2019-12-17 Align Technology, Inc. Methods and apparatuses for forming a three-dimensional volumetric model of a subject's teeth
US10517482B2 (en) * 2017-07-27 2019-12-31 Align Technology, Inc. Optical coherence tomography for orthodontic aligners
US20200000551A1 (en) 2018-06-29 2020-01-02 Align Technology, Inc. Providing a simulated outcome of dental treatment on a patient
US20200004402A1 (en) * 2018-06-29 2020-01-02 Align Technology, Inc. Visualization of teeth
US20200000554A1 (en) * 2018-06-29 2020-01-02 Align Technology, Inc. Dental arch width measurement tool
US20200000555A1 (en) 2018-06-29 2020-01-02 Align Technology, Inc. Visualization of clinical orthodontic assets and occlusion contact shape
US20200000552A1 (en) * 2018-06-29 2020-01-02 Align Technology, Inc. Photo of a patient with new simulated smile in an orthodontic treatment review software
US10537405B2 (en) * 2014-11-13 2020-01-21 Align Technology, Inc. Dental appliance with cavity for an unerupted or erupting tooth
US10568716B2 (en) * 2010-03-17 2020-02-25 ClearCorrect Holdings, Inc. Methods and systems for employing artificial intelligence in automated orthodontic diagnosis and treatment planning
US20200085546A1 (en) 2018-09-14 2020-03-19 Align Technology, Inc. Machine learning scoring system and methods for tooth position assessment
US10595965B2 (en) * 2012-03-01 2020-03-24 Align Technology, Inc. Interproximal reduction planning
US10595966B2 (en) 2016-11-04 2020-03-24 Align Technology, Inc. Methods and apparatuses for dental images
US20200105028A1 (en) 2018-09-28 2020-04-02 Align Technology, Inc. Generic framework for blurring of colors for teeth in generated images using height map
US20200107915A1 (en) 2018-10-04 2020-04-09 Align Technology, Inc. Molar trimming prediction and validation using machine learning
US10617489B2 (en) 2012-12-19 2020-04-14 Align Technology, Inc. Creating a digital dental model of a patient's teeth using interproximal information
US20200113649A1 (en) * 2018-10-12 2020-04-16 Laonpeople Inc. Apparatus and method for generating image of corrected teeth
US10631956B1 (en) * 2019-12-04 2020-04-28 Oxilio Ltd Methods and systems for making an orthodontic aligner having fixing blocks
US10631954B1 (en) * 2019-12-04 2020-04-28 Oxilio Ltd Systems and methods for determining orthodontic treatments
US20200155274A1 (en) * 2018-11-16 2020-05-21 Align Technology, Inc. Dental analysis with missing teeth prediction
US20200214800A1 (en) 2019-01-03 2020-07-09 Align Technology, Inc. Systems and methods for nonlinear tooth modeling
US10779718B2 (en) 2017-02-13 2020-09-22 Align Technology, Inc. Cheek retractor and mobile device holder
US20200297458A1 (en) 2019-03-21 2020-09-24 Align Technology, Inc. Automatic application of doctor's preferences workflow using statistical preference analysis
US20200306012A1 (en) 2019-03-29 2020-10-01 Align Technology, Inc. Segmentation quality assessment
US20200306011A1 (en) 2019-03-25 2020-10-01 Align Technology, Inc. Prediction of multiple treatment settings
US10792127B2 (en) 2017-01-24 2020-10-06 Align Technology, Inc. Adaptive orthodontic treatment
US20200315744A1 (en) 2019-04-03 2020-10-08 Align Technology, Inc. Dental arch analysis and tooth numbering
US10835349B2 (en) 2018-07-20 2020-11-17 Align Technology, Inc. Parametric blurring of colors for teeth in generated images
US20200360109A1 (en) 2019-05-14 2020-11-19 Align Technology, Inc. Visual presentation of gingival line generated based on 3d tooth model
US20200397304A1 (en) * 2014-05-07 2020-12-24 Align Technology, Inc. Caries detection using intraoral scan data

Family Cites Families (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6409504B1 (en) 1997-06-20 2002-06-25 Align Technology, Inc. Manipulating a digital dentition model to form models of individual dentition components
US7063532B1 (en) 1997-06-20 2006-06-20 Align Technology, Inc. Subdividing a digital dentition model
AU744385B2 (en) 1997-06-20 2002-02-21 Align Technology, Inc. Method and system for incrementally moving teeth
IL122807A0 (en) 1997-12-30 1998-08-16 Cadent Ltd Virtual orthodontic treatment
US7108508B2 (en) 1998-12-04 2006-09-19 Align Technology, Inc. Manipulable dental model system for fabrication of a dental appliance
US6602070B2 (en) 1999-05-13 2003-08-05 Align Technology, Inc. Systems and methods for dental treatment planning
AU2001249765A1 (en) 2000-03-30 2001-10-15 Align Technology, Inc. System and method for separating three-dimensional models
US7040896B2 (en) 2000-08-16 2006-05-09 Align Technology, Inc. Systems and methods for removing gingiva from computer tooth models
US7736147B2 (en) 2000-10-30 2010-06-15 Align Technology, Inc. Systems and methods for bite-setting teeth models
US7160107B2 (en) 2002-05-02 2007-01-09 Cadent Ltd. Method and system for assessing the outcome of an orthodontic treatment
US7156661B2 (en) 2002-08-22 2007-01-02 Align Technology, Inc. Systems and methods for treatment analysis by teeth matching
US20040197728A1 (en) 2002-09-10 2004-10-07 Amir Abolfathi Architecture for treating teeth
US7241142B2 (en) 2004-03-19 2007-07-10 Align Technology, Inc. Root-based tooth moving sequencing
US8099268B2 (en) 2007-05-25 2012-01-17 Align Technology, Inc. Tooth modeling
US10342638B2 (en) 2007-06-08 2019-07-09 Align Technology, Inc. Treatment planning and progress tracking systems and methods
CN105769352B (en) * 2014-12-23 2020-06-16 无锡时代天使医疗器械科技有限公司 Direct step-by-step method for producing orthodontic conditions
US20170273760A1 (en) 2016-03-28 2017-09-28 Align Technology, Inc. Systems, methods, and devices for predictable orthodontic treatment
US10722328B2 (en) 2017-10-05 2020-07-28 Align Technology, Inc. Virtual fillers for virtual models of dental arches
WO2020141366A1 (en) * 2018-12-31 2020-07-09 3M Innovative Properties Company Combining data from multiple dental anatomy scans
AU2020271096A1 (en) * 2019-04-11 2021-10-28 Candid Care Co. Dental aligners and procedures for aligning teeth
US11232573B2 (en) * 2019-09-05 2022-01-25 Align Technology, Inc. Artificially intelligent systems to manage virtual dental models using dental images
US12106845B2 (en) 2019-11-05 2024-10-01 Align Technology, Inc. Clinically relevant anonymization of photos and video
US11810271B2 (en) 2019-12-04 2023-11-07 Align Technology, Inc. Domain specific image quality assessment
US11622836B2 (en) * 2019-12-31 2023-04-11 Align Technology, Inc. Aligner stage analysis to obtain mechanical interactions of aligners and teeth for treatment planning

Patent Citations (145)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5975893A (en) * 1997-06-20 1999-11-02 Align Technology, Inc. Method and system for incrementally moving teeth
US6227851B1 (en) 1998-12-04 2001-05-08 Align Technology, Inc. Manipulable dental model system for fabrication of a dental appliance
US6299440B1 (en) 1999-01-15 2001-10-09 Align Technology, Inc. System and method for producing tooth movement
US6227850B1 (en) 1999-05-13 2001-05-08 Align Technology, Inc. Teeth viewing system
US6318994B1 (en) 1999-05-13 2001-11-20 Align Technology, Inc. Tooth path treatment plan
US6406292B1 (en) 1999-05-13 2002-06-18 Align Technology, Inc. System for determining final position of teeth
US6514074B1 (en) 1999-05-14 2003-02-04 Align Technology, Inc. Digitally modeling the deformation of gingival
US6471512B1 (en) * 1999-11-30 2002-10-29 OraMetrix, Inc. Method and apparatus for determining and monitoring orthodontic treatment
US20030009252A1 (en) * 2000-02-17 2003-01-09 Align Technology, Inc. Efficient data representation of teeth model
US6371761B1 (en) 2000-03-30 2002-04-16 Align Technology, Inc. Flexible plane for separating teeth models
US6488499B1 (en) 2000-04-25 2002-12-03 Align Technology, Inc. Methods for correcting deviations in preplanned tooth rearrangements
US6582229B1 (en) 2000-04-25 2003-06-24 Align Technology, Inc. Methods for modeling bite registration
US20150025907A1 (en) * 2000-04-25 2015-01-22 Align Technology, Inc. Treatment analysis systems and methods
US6621491B1 (en) 2000-04-27 2003-09-16 Align Technology, Inc. Systems and methods for integrating 3D diagnostic data
US6386878B1 (en) 2000-08-16 2002-05-14 Align Technology, Inc. Systems and methods for removing gingiva from teeth
US6726478B1 (en) 2000-10-30 2004-04-27 Align Technology, Inc. Systems and methods for bite-setting teeth models
US6783360B2 (en) 2000-12-13 2004-08-31 Align Technology, Inc. Systems and methods for positioning teeth
US7074038B1 (en) 2000-12-29 2006-07-11 Align Technology, Inc. Methods and systems for treating teeth
US6767208B2 (en) 2002-01-10 2004-07-27 Align Technology, Inc. System and method for positioning teeth
US20030143509A1 (en) 2002-01-29 2003-07-31 Cadent, Ltd. Method and system for assisting in applying an orthodontic treatment
US7074039B2 (en) 2002-05-02 2006-07-11 Cadent Ltd. Method and system for assessing the outcome of an orthodontic treatment
US20030207227A1 (en) 2002-05-02 2003-11-06 Align Technology, Inc. Systems and methods for treating patients
US7077647B2 (en) 2002-08-22 2006-07-18 Align Technology, Inc. Systems and methods for treatment analysis by teeth matching
US20040152036A1 (en) 2002-09-10 2004-08-05 Amir Abolfathi Architecture for treating teeth
US20040259049A1 (en) 2003-06-17 2004-12-23 Avi Kopelman Method and system for selecting orthodontic appliances
US20050182654A1 (en) 2004-02-14 2005-08-18 Align Technology, Inc. Systems and methods for providing treatment planning
US7970627B2 (en) 2004-02-27 2011-06-28 Align Technology, Inc. Method and system for providing dynamic orthodontic assessment and treatment profiles
US7970628B2 (en) 2004-02-27 2011-06-28 Align Technology, Inc. Method and system for providing dynamic orthodontic assessment and treatment profiles
US7930189B2 (en) 2004-02-27 2011-04-19 Align Technology, Inc. Method and system for providing dynamic orthodontic assessment and treatment profiles
US7880751B2 (en) 2004-02-27 2011-02-01 Align Technology, Inc. Method and system for providing dynamic orthodontic assessment and treatment profiles
US8126726B2 (en) 2004-02-27 2012-02-28 Align Technology, Inc. System and method for facilitating automated dental measurements and diagnostics
US20070238065A1 (en) * 2004-02-27 2007-10-11 Align Technology, Inc. Method and System for Providing Dynamic Orthodontic Assessment and Treatment Profiles
US8874452B2 (en) * 2004-02-27 2014-10-28 Align Technology, Inc. Method and system for providing dynamic orthodontic assessment and treatment profiles
US10653502B2 (en) * 2004-02-27 2020-05-19 Align Technology, Inc. Method and system for providing dynamic orthodontic assessment and treatment profiles
US7637740B2 (en) 2004-02-27 2009-12-29 Align Technology, Inc. Systems and methods for temporally staging teeth
US20050244791A1 (en) 2004-04-29 2005-11-03 Align Technology, Inc. Interproximal reduction treatment planning
US8260591B2 (en) 2004-04-29 2012-09-04 Align Technology, Inc. Dynamically specifying a view
US7357634B2 (en) 2004-11-05 2008-04-15 Align Technology, Inc. Systems and methods for substituting virtual dental appliances
US7309230B2 (en) 2004-12-14 2007-12-18 Align Technology, Inc. Preventing interference between tooth models
US7293988B2 (en) 2004-12-14 2007-11-13 Align Technology, Inc. Accurately predicting and preventing interference between tooth models
US20060127854A1 (en) 2004-12-14 2006-06-15 Huafeng Wen Image based dentition record digitization
US20060127836A1 (en) 2004-12-14 2006-06-15 Huafeng Wen Tooth movement tracking system
US20060127852A1 (en) 2004-12-14 2006-06-15 Huafeng Wen Image based orthodontic treatment viewing system
US20060275736A1 (en) 2005-04-22 2006-12-07 Orthoclear Holdings, Inc. Computer aided orthodontic treatment planning
US20060275731A1 (en) 2005-04-29 2006-12-07 Orthoclear Holdings, Inc. Treatment of teeth by aligners
US7555403B2 (en) 2005-07-15 2009-06-30 Cadent Ltd. Method for manipulating a dental virtual model, method for creating physical entities based on a dental virtual model thus manipulated, and dental models thus created
US20090098502A1 (en) * 2006-02-28 2009-04-16 Ormco Corporation Software and Methods for Dental Treatment Planning
US8843381B2 (en) 2006-04-18 2014-09-23 Align Technology, Inc. Automated method and system for case matching assessment based on geometrical evaluation of stages in treatment plan
US7904308B2 (en) 2006-04-18 2011-03-08 Align Technology, Inc. Method and system for providing indexing and cataloguing of orthodontic related treatment profiles and options
US20100009308A1 (en) 2006-05-05 2010-01-14 Align Technology, Inc. Visualizing and Manipulating Digital Models for Dental Treatment
US7746339B2 (en) 2006-07-14 2010-06-29 Align Technology, Inc. System and method for automatic detection of dental features
US7844429B2 (en) 2006-07-19 2010-11-30 Align Technology, Inc. System and method for three-dimensional complete tooth modeling
US7844356B2 (en) 2006-07-19 2010-11-30 Align Technology, Inc. System and method for automatic construction of orthodontic reference objects
US8038444B2 (en) 2006-08-30 2011-10-18 Align Technology, Inc. Automated treatment staging for teeth
US7689398B2 (en) 2006-08-30 2010-03-30 Align Technology, Inc. System and method for modeling and application of interproximal reduction of teeth
US8044954B2 (en) 2006-09-22 2011-10-25 Align Technology, Inc. System and method for automatic construction of tooth axes
US8401826B2 (en) 2006-12-22 2013-03-19 Align Technology, Inc. System and method for representation, modeling and application of three-dimensional digital pontics
US7878804B2 (en) 2007-02-28 2011-02-01 Align Technology, Inc. Tracking teeth movement correction
US9060829B2 (en) 2007-06-08 2015-06-23 Align Technology, Inc. Systems and method for management and delivery of orthodontic treatment
US20080306724A1 (en) 2007-06-08 2008-12-11 Align Technology, Inc. Treatment planning and progress tracking systems and methods
US8075306B2 (en) 2007-06-08 2011-12-13 Align Technology, Inc. System and method for detecting deviations during the course of an orthodontic treatment to gradually reposition teeth
US20140335466A1 (en) * 2007-06-08 2014-11-13 Align Technology, Inc. Treatment progress tracking and recalibration
US8562338B2 (en) 2007-06-08 2013-10-22 Align Technology, Inc. Treatment progress tracking and recalibration
US8788285B2 (en) 2007-08-02 2014-07-22 Align Technology, Inc. Clinical data file
US8275180B2 (en) 2007-08-02 2012-09-25 Align Technology, Inc. Mapping abnormal dental references
US7865259B2 (en) 2007-12-06 2011-01-04 Align Technology, Inc. System and method for improved dental geometry representation
US8439672B2 (en) 2008-01-29 2013-05-14 Align Technology, Inc. Method and system for optimizing dental aligner geometry
US7942672B2 (en) 2008-02-15 2011-05-17 Align Technology, Inc. Gingiva modeling
US8108189B2 (en) 2008-03-25 2012-01-31 Align Technologies, Inc. Reconstruction of non-visible part of tooth
US8092215B2 (en) 2008-05-23 2012-01-10 Align Technology, Inc. Smile designer
US20100068672A1 (en) 2008-09-16 2010-03-18 Hossein Arjomand Orthodontic condition evaluation
US20100068676A1 (en) 2008-09-16 2010-03-18 David Mason Dental condition evaluation and treatment
US20100092907A1 (en) 2008-10-10 2010-04-15 Align Technology, Inc. Method And System For Deriving A Common Coordinate System For Virtual Orthodontic Brackets
US20100129762A1 (en) * 2008-11-24 2010-05-27 Align Technology, Inc. Dental appliance with simulated teeth and method for making
US8591225B2 (en) 2008-12-12 2013-11-26 Align Technology, Inc. Tooth movement measurement by automatic impression matching
US9642678B2 (en) * 2008-12-30 2017-05-09 Align Technology, Inc. Method and system for dental visualization
US20100167243A1 (en) 2008-12-31 2010-07-01 Anton Spiridonov System and method for automatic construction of realistic looking tooth roots
US8896592B2 (en) 2009-08-21 2014-11-25 Align Technology, Inc. Digital dental modeling
US10568716B2 (en) * 2010-03-17 2020-02-25 ClearCorrect Holdings, Inc. Methods and systems for employing artificial intelligence in automated orthodontic diagnosis and treatment planning
US9211166B2 (en) 2010-04-30 2015-12-15 Align Technology, Inc. Individualized orthodontic treatment index
US9037439B2 (en) 2011-05-13 2015-05-19 Align Technology, Inc. Prioritization of three dimensional dental elements
US9125709B2 (en) 2011-07-29 2015-09-08 Align Technology, Inc. Systems and methods for tracking teeth movement during orthodontic treatment
US20140294273A1 (en) * 2011-08-31 2014-10-02 Maxime Jaisson Method for designing an orthodontic appliance
US20130204599A1 (en) 2012-02-02 2013-08-08 Align Technology, Inc. Virtually testing force placed on a tooth
US9375300B2 (en) 2012-02-02 2016-06-28 Align Technology, Inc. Identifying forces on a tooth
US9220580B2 (en) 2012-03-01 2015-12-29 Align Technology, Inc. Determining a dental treatment difficulty
US10595965B2 (en) * 2012-03-01 2020-03-24 Align Technology, Inc. Interproximal reduction planning
US9414897B2 (en) 2012-05-22 2016-08-16 Align Technology, Inc. Adjustment of tooth position in a virtual dental model
US10143536B2 (en) * 2012-10-31 2018-12-04 Ormco Corporation Computational device for an orthodontic appliance for generating an aesthetic smile
US9364296B2 (en) 2012-11-19 2016-06-14 Align Technology, Inc. Filling undercut areas of teeth relative to axes of appliance placement
US10617489B2 (en) 2012-12-19 2020-04-14 Align Technology, Inc. Creating a digital dental model of a patient's teeth using interproximal information
US10368719B2 (en) * 2013-03-14 2019-08-06 Ormco Corporation Registering shape data extracted from intra-oral imagery to digital reconstruction of teeth for determining position and orientation of roots
US9675428B2 (en) * 2013-07-12 2017-06-13 Carestream Health, Inc. Video-based auto-capture for dental surface imaging apparatus
US20200397304A1 (en) * 2014-05-07 2020-12-24 Align Technology, Inc. Caries detection using intraoral scan data
US10537405B2 (en) * 2014-11-13 2020-01-21 Align Technology, Inc. Dental appliance with cavity for an unerupted or erupting tooth
US20160135925A1 (en) * 2014-11-13 2016-05-19 Align Technology, Inc. Method for tracking, predicting, and proactively correcting malocclusion and related issues
US20160242870A1 (en) 2015-02-23 2016-08-25 Align Technology, Inc. Method to manufacture aligner by modifying tooth position
US20160287354A1 (en) * 2015-04-06 2016-10-06 Smarter Alloys Inc. Systems and methods for orthodontic archwires for malocclusions
US20160310235A1 (en) 2015-04-24 2016-10-27 Align Technology, Inc. Comparative orthodontic treatment planning tool
US10248883B2 (en) 2015-08-20 2019-04-02 Align Technology, Inc. Photograph-based assessment of dental treatments and procedures
US10507087B2 (en) * 2016-07-27 2019-12-17 Align Technology, Inc. Methods and apparatuses for forming a three-dimensional volumetric model of a subject's teeth
US10509838B2 (en) * 2016-07-27 2019-12-17 Align Technology, Inc. Methods and apparatuses for forming a three-dimensional volumetric model of a subject's teeth
US10463452B2 (en) 2016-08-24 2019-11-05 Align Technology, Inc. Method to visualize and manufacture aligner by modifying tooth position
US10595966B2 (en) 2016-11-04 2020-03-24 Align Technology, Inc. Methods and apparatuses for dental images
US20180168775A1 (en) 2016-12-20 2018-06-21 Align Technology, Inc. Matching assets in 3d treatment plans
US10792127B2 (en) 2017-01-24 2020-10-06 Align Technology, Inc. Adaptive orthodontic treatment
US10779718B2 (en) 2017-02-13 2020-09-22 Align Technology, Inc. Cheek retractor and mobile device holder
US20180263731A1 (en) 2017-03-20 2018-09-20 Align Technology, Inc. Generating a virtual depiction of an orthodontic treatment of a patient
US10758322B2 (en) 2017-03-20 2020-09-01 Align Technology, Inc. Virtually representing an orthodontic treatment outcome using automated detection of facial and dental reference objects
US10828130B2 (en) 2017-03-20 2020-11-10 Align Technology, Inc. Automated 2D/3D integration and lip spline autoplacement
US20180280118A1 (en) 2017-03-27 2018-10-04 Align Technology, Inc. Apparatuses and methods assisting in dental therapies
US20190029784A1 (en) 2017-07-27 2019-01-31 Align Technology, Inc. Tooth shading, transparency and glazing
US10517482B2 (en) * 2017-07-27 2019-12-31 Align Technology, Inc. Optical coherence tomography for orthodontic aligners
US20190076214A1 (en) 2017-08-15 2019-03-14 Align Technology, Inc. Buccal corridor assessment and computation
US20190053876A1 (en) 2017-08-17 2019-02-21 Align Technology, Inc. Systems, methods, and apparatus for correcting malocclusions of teeth
US20190105127A1 (en) 2017-10-05 2019-04-11 Align Technology, Inc. Virtual fillers
US20190175303A1 (en) * 2017-11-01 2019-06-13 Align Technology, Inc. Automatic treatment planning
US20190180443A1 (en) 2017-11-07 2019-06-13 Align Technology, Inc. Deep learning for tooth detection and evaluation
US20190192259A1 (en) 2017-12-15 2019-06-27 Align Technology, Inc. Closed loop adaptive orthodontic treatment methods and apparatuses
US10390913B2 (en) * 2018-01-26 2019-08-27 Align Technology, Inc. Diagnostic intraoral scanning
US20190333622A1 (en) 2018-04-30 2019-10-31 Align Technology, Inc. Systems and methods for treatment using domain-specific treatment protocols
US20190328488A1 (en) 2018-04-30 2019-10-31 Align Technology, Inc. Systems and methods for treatment using domain-specific treatment protocols
US20190328487A1 (en) 2018-04-30 2019-10-31 Align Technology, Inc. Systems and methods for treatment using domain-specific treatment protocols
US20190343601A1 (en) 2018-05-08 2019-11-14 Align Technology, Inc. Automatic ectopic teeth detection on scan
US20190350680A1 (en) 2018-05-21 2019-11-21 Align Technology, Inc. Photo realistic rendering of smile image after treatment
US20190357997A1 (en) 2018-05-22 2019-11-28 Align Technology, Inc. Tooth segmentation based on anatomical edge information
US20200000555A1 (en) 2018-06-29 2020-01-02 Align Technology, Inc. Visualization of clinical orthodontic assets and occlusion contact shape
US20200000554A1 (en) * 2018-06-29 2020-01-02 Align Technology, Inc. Dental arch width measurement tool
US20200000551A1 (en) 2018-06-29 2020-01-02 Align Technology, Inc. Providing a simulated outcome of dental treatment on a patient
US20200004402A1 (en) * 2018-06-29 2020-01-02 Align Technology, Inc. Visualization of teeth
US20200000552A1 (en) * 2018-06-29 2020-01-02 Align Technology, Inc. Photo of a patient with new simulated smile in an orthodontic treatment review software
US10835349B2 (en) 2018-07-20 2020-11-17 Align Technology, Inc. Parametric blurring of colors for teeth in generated images
US20200085546A1 (en) 2018-09-14 2020-03-19 Align Technology, Inc. Machine learning scoring system and methods for tooth position assessment
US20200105028A1 (en) 2018-09-28 2020-04-02 Align Technology, Inc. Generic framework for blurring of colors for teeth in generated images using height map
US20200107915A1 (en) 2018-10-04 2020-04-09 Align Technology, Inc. Molar trimming prediction and validation using machine learning
US20200113649A1 (en) * 2018-10-12 2020-04-16 Laonpeople Inc. Apparatus and method for generating image of corrected teeth
US20200155274A1 (en) * 2018-11-16 2020-05-21 Align Technology, Inc. Dental analysis with missing teeth prediction
US20200214800A1 (en) 2019-01-03 2020-07-09 Align Technology, Inc. Systems and methods for nonlinear tooth modeling
US20200297458A1 (en) 2019-03-21 2020-09-24 Align Technology, Inc. Automatic application of doctor's preferences workflow using statistical preference analysis
US20200306011A1 (en) 2019-03-25 2020-10-01 Align Technology, Inc. Prediction of multiple treatment settings
US20200306012A1 (en) 2019-03-29 2020-10-01 Align Technology, Inc. Segmentation quality assessment
US20200315744A1 (en) 2019-04-03 2020-10-08 Align Technology, Inc. Dental arch analysis and tooth numbering
US20200360109A1 (en) 2019-05-14 2020-11-19 Align Technology, Inc. Visual presentation of gingival line generated based on 3d tooth model
US10631956B1 (en) * 2019-12-04 2020-04-28 Oxilio Ltd Methods and systems for making an orthodontic aligner having fixing blocks
US10631954B1 (en) * 2019-12-04 2020-04-28 Oxilio Ltd Systems and methods for determining orthodontic treatments

Cited By (79)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11950977B2 (en) 2006-08-30 2024-04-09 Align Technology, Inc. Methods for schedule of movement modifications in orthodontic treatment
US11717381B2 (en) 2006-08-30 2023-08-08 Align Technology, Inc. Methods for tooth collision detection and avoidance in orthodontic treatment
US11766311B2 (en) 2007-06-08 2023-09-26 Align Technology, Inc. Treatment progress tracking and recalibration
US11819377B2 (en) 2007-06-08 2023-11-21 Align Technology, Inc. Generating 3D models of a patient's teeth based on 2D teeth images
US11737852B2 (en) 2008-03-25 2023-08-29 Align Technology, Inc. Computer-implemented method of smoothing a shape of a tooth model
US11417432B2 (en) 2008-05-23 2022-08-16 Align Technology, Inc. Smile designer
US11232867B2 (en) 2008-05-23 2022-01-25 Align Technology, Inc. Smile designer
US11883255B2 (en) 2008-12-30 2024-01-30 Align Technology, Inc. Method and system for dental visualization
US11376100B2 (en) 2009-08-21 2022-07-05 Align Technology, Inc. Digital dental modeling
US12109089B2 (en) 2010-04-30 2024-10-08 Align Technology, Inc. Individualized orthodontic treatment index
US11864969B2 (en) 2011-05-13 2024-01-09 Align Technology, Inc. Prioritization of three dimensional dental elements
US12181857B2 (en) 2011-07-29 2024-12-31 Align Technology, Inc. Systems and methods for tracking teeth movement during orthodontic treatment
US11986369B2 (en) 2012-03-01 2024-05-21 Align Technology, Inc. Methods and systems for determining a dental treatment difficulty in digital treatment planning
US11678954B2 (en) 2012-05-22 2023-06-20 Align Technology, Inc. Adjustment of tooth position in a virtual dental model
US12279925B2 (en) 2012-10-30 2025-04-22 University Of Southern California Orthodontic appliance with snap fitted, non-sliding archwire
US11678956B2 (en) 2012-11-19 2023-06-20 Align Technology, Inc. Filling undercut areas of teeth relative to axes of appliance placement
US11957532B2 (en) 2012-12-19 2024-04-16 Align Technology, Inc. Creating a digital dental model of a patient's teeth using interproximal information
US12048606B2 (en) 2015-02-23 2024-07-30 Align Technology, Inc. Systems for treatment planning with overcorrection
US11723749B2 (en) 2015-08-20 2023-08-15 Align Technology, Inc. Photograph-based assessment of dental treatments and procedures
US11819375B2 (en) 2016-11-04 2023-11-21 Align Technology, Inc. Methods and apparatuses for dental images
US11872102B2 (en) 2017-01-24 2024-01-16 Align Technology, Inc. Updating an orthodontic treatment plan during treatment
US11957536B2 (en) 2017-01-31 2024-04-16 Swift Health Systems Inc. Hybrid orthodontic archwires
US11805991B2 (en) 2017-02-13 2023-11-07 Align Technology, Inc. Cheek retractor and mobile device holder
US11864971B2 (en) 2017-03-20 2024-01-09 Align Technology, Inc. Generating a virtual patient depiction of an orthodontic treatment
US12220294B2 (en) 2017-04-21 2025-02-11 Swift Health Systems Inc. Indirect bonding trays, non-sliding orthodontic appliances, and registration systems for use thereof
US11998410B2 (en) 2017-07-27 2024-06-04 Align Technology, Inc. Tooth shading, transparency and glazing
US12232924B2 (en) 2017-08-15 2025-02-25 Align Technology, Inc. Buccal corridor assessment and computation
US12064310B2 (en) 2017-08-17 2024-08-20 Align Technology, Inc. Systems, methods, and apparatus for correcting malocclusions of teeth
US11992382B2 (en) 2017-10-05 2024-05-28 Align Technology, Inc. Virtual fillers for virtual models of dental arches
US12213855B2 (en) 2017-11-01 2025-02-04 Align Technology, Inc. Methods of manufacturing dental aligners
US12268569B2 (en) 2017-11-01 2025-04-08 Align Technology, Inc. Treatment planning for aligning a patient's teeth
US12213856B2 (en) 2017-11-01 2025-02-04 Align Technology, Inc. Treatment plan generation using collision detection by shape filling
US11790643B2 (en) 2017-11-07 2023-10-17 Align Technology, Inc. Deep learning for tooth detection and evaluation
US11957531B2 (en) 2017-12-15 2024-04-16 Align Technology, Inc. Orthodontic systems for monitoring treatment
US11751974B2 (en) 2018-05-08 2023-09-12 Align Technology, Inc. Automatic ectopic teeth detection on scan
US11672629B2 (en) 2018-05-21 2023-06-13 Align Technology, Inc. Photo realistic rendering of smile image after treatment
US11759291B2 (en) 2018-05-22 2023-09-19 Align Technology, Inc. Tooth segmentation based on anatomical edge information
US11801121B2 (en) 2018-06-29 2023-10-31 Align Technology, Inc. Methods for generating composite images of a patient
US11666416B2 (en) 2018-06-29 2023-06-06 Align Technology, Inc. Methods for simulating orthodontic treatment
US11395717B2 (en) 2018-06-29 2022-07-26 Align Technology, Inc. Visualization of clinical orthodontic assets and occlusion contact shape
US11464604B2 (en) 2018-06-29 2022-10-11 Align Technology, Inc. Dental arch width measurement tool
US11452577B2 (en) 2018-07-20 2022-09-27 Align Technology, Inc. Generation of synthetic post treatment images of teeth
US11534272B2 (en) 2018-09-14 2022-12-27 Align Technology, Inc. Machine learning scoring system and methods for tooth position assessment
US11842437B2 (en) 2018-09-19 2023-12-12 Align Technology, Inc. Marker-less augmented reality system for mammoplasty pre-visualization
US11151753B2 (en) 2018-09-28 2021-10-19 Align Technology, Inc. Generic framework for blurring of colors for teeth in generated images using height map
US11654001B2 (en) 2018-10-04 2023-05-23 Align Technology, Inc. Molar trimming prediction and validation using machine learning
US11771526B2 (en) 2019-01-03 2023-10-03 Align Technology, Inc. Systems and methods for nonlinear tooth modeling
US12020373B2 (en) * 2019-02-15 2024-06-25 Medit Corp. Method for replaying scanning process
US20210375031A1 (en) * 2019-02-15 2021-12-02 Medit Corp. Method for replaying scanning process
US12042354B2 (en) 2019-03-01 2024-07-23 Swift Health Systems Inc. Indirect bonding trays with bite turbo and orthodontic auxiliary integration
US12193905B2 (en) 2019-03-25 2025-01-14 Align Technology, Inc. Prediction of multiple treatment settings
US11707344B2 (en) 2019-03-29 2023-07-25 Align Technology, Inc. Segmentation quality assessment
US11357598B2 (en) 2019-04-03 2022-06-14 Align Technology, Inc. Dental arch analysis and tooth numbering
US12064311B2 (en) 2019-05-14 2024-08-20 Align Technology, Inc. Visual presentation of gingival line generated based on 3D tooth model
US12002152B2 (en) * 2019-07-09 2024-06-04 Panasonic Intellectual Property Management Co., Ltd. Three-dimensional model generation method and three-dimensional model generation device
US20220114785A1 (en) * 2019-07-09 2022-04-14 Panasonic Intellectual Property Management Co., Ltd. Three-dimensional model generation method and three-dimensional model generation device
US11651494B2 (en) 2019-09-05 2023-05-16 Align Technology, Inc. Apparatuses and methods for three-dimensional dental segmentation using dental image data
US11232573B2 (en) 2019-09-05 2022-01-25 Align Technology, Inc. Artificially intelligent systems to manage virtual dental models using dental images
US12053346B2 (en) 2019-10-31 2024-08-06 Swift Health Systems Inc. Indirect orthodontic bonding systems and methods
US12106845B2 (en) 2019-11-05 2024-10-01 Align Technology, Inc. Clinically relevant anonymization of photos and video
US12086964B2 (en) 2019-12-04 2024-09-10 Align Technology, Inc. Selective image modification based on sharpness metric and image domain
US11903793B2 (en) 2019-12-31 2024-02-20 Align Technology, Inc. Machine learning dental segmentation methods using sparse voxel representations
US12076207B2 (en) 2020-02-05 2024-09-03 Align Technology, Inc. Systems and methods for precision wing placement
US12048605B2 (en) 2020-02-11 2024-07-30 Align Technology, Inc. Tracking orthodontic treatment using teeth images
US12125581B2 (en) 2020-02-20 2024-10-22 Align Technology, Inc. Medical imaging data compression and extraction on client side
US12090025B2 (en) 2020-06-11 2024-09-17 Swift Health Systems Inc. Orthodontic appliance with non-sliding archform
US11991439B2 (en) 2020-07-23 2024-05-21 Align Technology, Inc. Systems, apparatus, and methods for remote orthodontic treatment
US11991440B2 (en) 2020-07-23 2024-05-21 Align Technology, Inc. Treatment-based image capture guidance
US11985414B2 (en) 2020-07-23 2024-05-14 Align Technology, Inc. Image-based aligner fit evaluation
US11962892B2 (en) 2020-07-23 2024-04-16 Align Technology, Inc. Image based dentition tracking
US11800216B2 (en) 2020-07-23 2023-10-24 Align Technology, Inc. Image based orthodontic treatment refinement
US11864970B2 (en) 2020-11-06 2024-01-09 Align Technology, Inc. Accurate method to determine center of resistance for 1D/2D/3D problems
US20220218438A1 (en) * 2021-01-14 2022-07-14 Orthosnap Corp. Creating three-dimensional (3d) animation
US12268571B2 (en) 2021-03-12 2025-04-08 Swift Health Systems Inc. Indirect orthodontic bonding systems and methods
USD1063077S1 (en) 2021-04-26 2025-02-18 Align Technology, Inc. Dental imaging attachment for a smartphone
US12193908B2 (en) 2021-09-03 2025-01-14 Swift Health Systems, Inc. Orthodontic appliance with non-sliding archform
US12053345B2 (en) 2021-09-03 2024-08-06 Swift Health Systems Inc. Method of administering adhesive to bond orthodontic brackets
US12220288B2 (en) 2021-10-27 2025-02-11 Align Technology, Inc. Systems and methods for orthodontic and restorative treatment planning
USD1043994S1 (en) 2022-01-06 2024-09-24 Swift Health Systems Inc. Archwire

Also Published As

Publication number Publication date
US20230004276A1 (en) 2023-01-05
US20210333978A1 (en) 2021-10-28
US12067210B2 (en) 2024-08-20
US11449191B2 (en) 2022-09-20
US20240028178A1 (en) 2024-01-25
US20240361879A1 (en) 2024-10-31
US20200004402A1 (en) 2020-01-02
US11809214B2 (en) 2023-11-07

Similar Documents

Publication Publication Date Title
US11449191B2 (en) Digital treatment planning by modeling inter-arch collisions
US11877906B2 (en) Dental arch width measurement tool
US20230320819A1 (en) Dental appliance hook placement and visualization
JP5497766B2 (en) Tools for customized design of dental restorations
US7474932B2 (en) Dental computer-aided design (CAD) methods and systems
US8469705B2 (en) Method and system for integrated orthodontic treatment planning using unified workstation
US11058524B2 (en) Dental restoration design tools
US20170079748A1 (en) Software And Methods For Dental Treatment Planning
WO2021245484A1 (en) Display of multiple automated orthodontic treatment options
CN110176056B (en) Computer-implemented method for modifying a digital three-dimensional model of dentition
US20140379356A1 (en) Method and system for integrated orthodontic treatment planning using unified workstation
US20050271996A1 (en) Method and system for comprehensive evaluation of orthodontic care using unified workstation
US20220000592A1 (en) Dental restoration design tools
US20200405445A1 (en) Orthodontic appliances, digital tools, and methods for dental treatment planning
KR102473722B1 (en) Method for providing section image of tooth and dental image processing apparatus therefor
CN116529833A (en) Digital tooth setting method and device using tooth setting graphic user interface
US20240407901A1 (en) Method to semi-automatically determine virtual dental occlusion
US20240197441A1 (en) Techniques for determining patient teeth positions for orthodontics
KR20250060073A (en) Method for providing orthodontic image and apparatus thereof
Emery III et al. Dynamic Navigation for Dental Implants
EP3666225A1 (en) Method for creating a graphic representation of a dental condition
KR20230135235A (en) Method for simulating orthodontic treatment, and apparatus implementing the same method

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: ALIGN TECHNOLOGY, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MAKARENKOVA, SVETLANA;KUANBEKOV, ARTEM;ZHULIN, ALEKSANDR;AND OTHERS;SIGNING DATES FROM 20190703 TO 20190705;REEL/FRAME:049981/0019

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4