    Investigation of Transliteration Algorithm Operation in Real Time for Augmented Reality

    View/Open
    surc/2015/schedule/319/fulltext (1).pdf (278.9Kb)
    Date
    2015-04-10
    Author
    Battison, Jon L.
    Subject
    Algorithmic Transliteration
    Augmented Reality
    Text Translation
    Abstract
    The ability to understand and translate languages is a sought-after commodity. Modern computers are capable of translation, but they require the user to disengage from their environment to operate. This research will show that it is possible to create a device that does not separate the user from their environment while still allowing them to comprehend foreign languages. The goal of the research is to produce an apparatus that can translate text seamlessly while moving its displayed field of vision in step with the user's movements. To achieve this effect, three distinct operations must be done quickly. First, input will be taken from the user's perspective in the form of digital video using an Ovrvision 1 stereoscopic camera, which features a wider field of vision than that of the user, allowing for predictive translation. Next, an Android-based translation algorithm will be applied to this input to detect words in a language other than the user's and replace them with their translations. The algorithm should do so in a way that simulates the words' appearance in the raw input, in order to offer the user an accurate reproduction of the environment. This augmented video will then be returned to the user by means of the Oculus Rift virtual reality system, thereby achieving the desired result of translating everything in the user's field of vision without disruption.
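    The three-stage loop described in the abstract (capture a frame, translate and replace foreign words, display the augmented frame) can be sketched as follows. This is a minimal illustration, not the author's implementation: the translation table and function names are hypothetical, and the camera/headset I/O is stubbed out since it would depend on the Ovrvision and Oculus SDKs.

    ```python
    # Hypothetical sketch of the capture -> translate -> display pipeline.
    # Only the word-replacement stage is shown concretely; in the real
    # system the replaced words would also be redrawn over their original
    # locations to mimic their appearance in the raw video.

    # Stand-in translation table (assumption for illustration).
    FOREIGN_TO_ENGLISH = {
        "salida": "exit",
        "peligro": "danger",
    }

    def translate_words(detected_words):
        """Replace each recognized foreign word with its translation,
        leaving words already in the user's language unchanged."""
        return [FOREIGN_TO_ENGLISH.get(word.lower(), word)
                for word in detected_words]

    def process_frame(detected_words):
        # Stage 2 of the pipeline: stages 1 (camera capture + OCR) and
        # 3 (rendering to the headset) are omitted here.
        return translate_words(detected_words)

    print(process_frame(["Salida", "ahead"]))  # -> ['exit', 'ahead']
    ```

    In the full system the input list would come from text recognition on each stereoscopic frame, and the output would drive the overlay rendered to the Oculus Rift.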
    URI
    http://hdl.handle.net/1951/72777
    Collections
    • 2015 SUNY Undergraduate Research Conference [409]

    SUNY Digital Repository Support
    DSpace software copyright © 2002-2023  DuraSpace
    Contact Us | Send Feedback
    DSpace Express is a service operated by Atmire NV