WRS 

Project Phase 2 - FINAL

 

Title: Theia - Navigating indoors for visually impaired people

Vision Document Link

 Presentation Link

Mockup Link

Questionnaire Link

 

Team Members:

        Adit Shah (axs190336)

Meet Chanchad (mxc210021)

Jeel Patel (jpp210001)

Aditya Veer (axv210087)

Anushruti Singh (axs220183)

Mayank Goyani (mxg200078)

Team URL: https://cs6361.meetc.dev 

 

 

Date:  05/04/2023


Revision History

Version | Date | Description | Author(s)
Vs 1.0 - Preliminary Project Plan | 01/26/2023 | First draft of the project for the preliminary project plan | Jeel, Adit, Meet, Anushruti, Aaditya and Mayank
Vs 2.0 - Final Project 1 | 03/28/2023 | Second draft of the project with mockup and WRS document development | Jeel, Adit, Meet, Anushruti, Aaditya and Mayank
Vs 3.0 - Interim Project 2 | 03/18/2023 | Third draft for interim project 2 with minor additions | Meet, Jeel, Adit, Aaditya, Anushruti and Mayank
Vs 4.0 - Final Project 2 | 05/04/2023 | Fourth draft for final project 2 with minor additions | Anushruti, Meet, Adit, Jeel, Aaditya and Mayank

 

Process (team members, roles, team leaders, meetings, input, activities, output, resources,…)

We held weekly meetings where the entire group came together to discuss ideas, features, FRs, and NFRs for the final stage of our project. The purpose of these meetings was to identify problems with the deliverables of interim project 2 and to negotiate and systematically choose the best options for fixing those problems and for the future development of the THEIA app. We decided to use the time between sessions to collect more problems individually and to discuss potential solutions at the meetings.


1. Introduction

1.1 Purpose

This document captures the team's efforts to better understand the Preliminary Project Plan, which served as the basis for the development of the Theia Project's software, along with the outcomes of those efforts. It details the issues that were encountered, the solutions that were considered to address them, the final decisions that were made, and the reasoning for each decision.

1.2 Scope

This document describes a smartphone app that enables a blind person to navigate indoors. The primary function of the application is to provide directions and voice commands to the user while navigating indoor spaces, since a blind person who needs to move from one location to another within a building faces significant challenges. The safety of the user is one of the primary concerns of this app; for instance, if the user meets with an accident, the application will assist in the emergency.

Since smartphones have become so commonplace in recent years, the application specifically targets that platform. Users may find this application more helpful than present traditional assistance because they do not need to buy new devices.

1.3 Definitions, acronyms, and abbreviations

OS: Operating System

Android: The operating system utilized by a large portion of smartphones.

iOS: The operating system primarily used by Apple devices.

Theia: Greek Goddess of Vision.

PPP: Preliminary Project Plan

Caretaker: An assistive person available to the blind person (person with impaired vision) at the time of need.

GPS: Global Positioning System

Sensors: Any additional sensory devices installed in or connected to the smartphone which may be used by the application throughout its operation.

Auditory: Related to the hearing senses or including audio.

ADA: The Americans with Disabilities Act.

1.4 Document Overview

The introduction of this document provides an overview of the goals, scope, and acronyms used throughout the body of the document, together with a summary of what to expect in the rest of the document. The second section lists the concerns detected and the judgments made, along with their justifications; these issues were discovered in the early requirements definition document for this project. It is followed by the team's improved understanding of the preliminary requirements specification. The fourth section gives a thorough explanation of the prototype development. The fifth section provides traceability of the identified requirements. The last section gives an inventory of the sources and references used.


2. Issues with Preliminary Definition Given (ambiguities, incompleteness, inconsistency, conflicts, ...)

        

2.1 Issues with II.1 The Domain, Stakeholders, Functional and Non-Functional Objectives 

2.1.1 Issue-1: Ambiguous definition of the blind person (Unsighted).

Issue Description: According to the preliminary definition, a blind person is the main stakeholder. However, the severity of the impairment is not considered: it could be color blindness, partial blindness, or total blindness. The individual could have developed blindness later in life or could have been born blind.

Options:

  1. We limit the scope of our definition of blindness to individuals who are totally blind.
  2. Any individual with vision impairment, regardless of severity, is regarded as a project stakeholder.

Decision and Rationale: The software must support users with all degrees and types of blindness. Any stakeholder who has a vision impairment is permitted to utilize the application.

2.1.2 Issue-2: Ambiguous definition of the building/part of the building to be navigated by the application.

Issue Description: The preliminary definition does not specify which areas of the building will be covered by the application. Furthermore, it does not make clear which spaces and rooms the application covers.

Options:

  1. Cover all the parts, floors, rooms and areas of all three ECS buildings (ECSS,ECSN,ECSW).
  2. Cover one building and every part of the building including all the floors of that building.
  3. Cover the connected parts of the ECSS and ECSN buildings.

Decision and Rationale: The app must cover the ground floor of the connected sections of the ECSS and ECSN buildings. Given that there is a clear path between the ECSS and ECSN buildings, the application's complexity will remain manageable. All of the restrooms, labs, lecture halls, auditoriums, sitting areas, and lounges in the aforementioned building areas will be covered by the application. Since it would be challenging for both the smartphone and the user to determine their current floor, we decided against supporting multiple floors.

2.1.3 Issue-3: Ambiguous definition of the secondary stakeholders

Issue Description: The personnel involved in setting up the app or responding to emergencies are not specified in detail under the preliminary definition.

Options:

  1. A caretaker or the blind person will be able to set the configuration based on text or voice prompts respectively.
  2. Users will be given an option to set the emergency contact based on their preference and will default to calling 911 in case the emergency contact list is empty.

Decision and Rationale: The following parties will be considered secondary stakeholders in this project:

  1.  A third party assisting the blind individual during the app's initial configuration, if necessary.
  2. A family member or caregiver who is identified as the blind person's emergency contact. If the blind person gets lost, hurt, or has an emergency, a call should be made to this person.
  3. A member of the emergency or assistive services, if a call is made to one of these organizations by the blind person using this app.

2.1.4 Issue-4: Ambiguous definition of input method.

Issue Description: The non-vision-based input and interaction methods for the app are not clearly defined.

Options:

  1. By making use of a voice over feature of the OS, the app will allow users to interact with the application.
  2. The user should be able to interact with the application with a combination of touch input and haptic feedback from the application.

Decision and Rationale: The application will utilize both methods to provide redundancy. If either one of them fails, the user will still be able to communicate with the application.
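To illustrate the redundancy decision, the sketch below shows one way the app could accept a command from either the voice channel or the touch channel, so that failure of one channel does not block the user. All names (InputChannel, VoiceChannel, Command) and the placeholder return values are illustrative assumptions, not part of the preliminary definition.

```kotlin
// Hypothetical sketch of redundant input handling: a command may arrive from
// either the voice channel or the touch channel, and the first channel that
// succeeds is used. Every name here is illustrative.

enum class Command { SET_DESTINATION, CONFIRM, CANCEL, EMERGENCY_CALL }

interface InputChannel {
    val name: String
    // Returns the recognized command, or null if this channel failed or timed out.
    fun readCommand(): Command?
}

class VoiceChannel : InputChannel {
    override val name = "voice"
    override fun readCommand(): Command? = null // placeholder: speech recognition unavailable
}

class TouchChannel : InputChannel {
    override val name = "touch"
    override fun readCommand(): Command? = Command.CONFIRM // placeholder: user tapped Confirm
}

// Try each channel in order; the failure of one channel does not prevent the
// user from communicating with the application.
fun readUserCommand(channels: List<InputChannel>): Command? =
    channels.firstNotNullOfOrNull { it.readCommand() }

fun main() {
    val command = readUserCommand(listOf(VoiceChannel(), TouchChannel()))
    println("Recognized command: $command") // Recognized command: CONFIRM
}
```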

2.1.5 Issue-5: Incomplete specification of the languages in which the user will be able to interact with the system.

Issue Description: What languages will be supported by the application for user interaction is not explicitly stated in the specification.

Options:

  1. Only allow the English language to communicate and interact with the application.
  2. Allow the user to select from a list of languages supported by the application at the time of configuration.
  3. Allow the user to change their preferred language selection at their will.

Decision and Rationale: The application will adopt both options II and III, allowing the user to select a language at the time of setup and to change it at any time through the settings menu.
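As a small illustration of this decision, the sketch below stores a language preference chosen at setup and lets it be changed later from settings; the list of supported languages and the UserPreferences type are assumptions made only for the example.

```kotlin
// Hypothetical sketch of the language-preference behaviour: a language is picked
// during setup and can be changed later from the settings menu.

val SUPPORTED_LANGUAGES = listOf("en", "es", "hi", "gu") // illustrative list

data class UserPreferences(var language: String = "en")

fun selectLanguage(prefs: UserPreferences, requested: String): Boolean {
    if (requested !in SUPPORTED_LANGUAGES) return false // unsupported: keep current language
    prefs.language = requested
    return true
}

fun main() {
    val prefs = UserPreferences()
    selectLanguage(prefs, "es")   // chosen during initial setup
    println(prefs.language)       // es
    selectLanguage(prefs, "hi")   // changed later from the settings menu
    println(prefs.language)       // hi
}
```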

        

2.2 Issues with II.2 Software System Requirements: Functional Requirements 

2.2.1 Issue-1: Incomplete definition of how the user enters the destination location.        

Issue description : Here, the method by which the app accepts the destination location is not made explicit. Additionally, it is stated that the app may use the user's regular schedule, but what if the user wants to visit a different location than their regular one? Would the user be able to look for a specific location to go to?

Options:

  1. The app will suggest some locations based on user habits. The user will select one of them.
  2. The user will be able to enter the location manually through either voice input or via accessibility typing.
  3. System might use the user's daily routine schedule but the user should be informed before finalizing the destination location.

Decision and Rationale: A combination of all the options will be adopted. The user may accept a recommendation or opt to enter another location manually.
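A minimal sketch of this combined behaviour is shown below, assuming a hypothetical Destination type and a simple frequency-based recommendation; a manual entry, when present, overrides the suggestion.

```kotlin
// Hypothetical sketch of the combined destination-entry decision: the app offers
// a habit-based recommendation, which the user may accept or override by entering
// another destination manually (via voice or accessibility typing). All types and
// room labels are illustrative assumptions.

data class Destination(val roomId: String, val label: String)

// Suggest the destination the user has visited most often.
fun recommendDestination(history: List<Destination>): Destination? =
    history.groupingBy { it }.eachCount().maxByOrNull { it.value }?.key

// Manual entry always wins over the suggestion.
fun chooseDestination(history: List<Destination>, manualEntry: Destination?): Destination? =
    manualEntry ?: recommendDestination(history)

fun main() {
    val history = listOf(
        Destination("ECSS lab", "Lab"),
        Destination("ECSS lab", "Lab"),
        Destination("ECSN lecture hall", "Lecture hall")
    )
    println(chooseDestination(history, manualEntry = null))                    // suggested Lab
    println(chooseDestination(history, Destination("ECSS lounge", "Lounge")))  // manual override
}
```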

2.2.2 Issue-2: Ambiguity in how routes will be calculated and presented to the user, and how these routes will be selected by the user.

Issue description: The preliminary definition does not state how many potential routes to the destination will be presented to the user, how much control the app or the user will have in choosing the optimal path, or whether the user's preferences will be taken into consideration in the computation.

        

Options:

  1. Multiple routes can be prompted to the user.
  2. Shortest path will be selected by the app.
  3. Calculate the route best suited to the user based on their preferences and past habits.

Decision and Rationale:

Since it is indoor navigation, the user would benefit most from the best path option. The application will start giving the user navigation instructions as soon as they enter a specific destination.
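A minimal sketch of how the best path could be computed is shown below, assuming the covered ground-floor area is modelled as a graph of corridor waypoints with walking distances as edge weights; Dijkstra's algorithm is used here, and the map data is purely illustrative. User preferences could later be folded in by adjusting the edge weights.

```kotlin
// A minimal sketch of "best path" selection over an assumed indoor waypoint graph.

import java.util.PriorityQueue

fun shortestRoute(
    edges: Map<String, List<Pair<String, Double>>>, // node -> (neighbour, distance in metres)
    start: String,
    goal: String
): List<String>? {
    val dist = mutableMapOf(start to 0.0)
    val prev = mutableMapOf<String, String>()
    val queue = PriorityQueue<Pair<String, Double>>(compareBy { it.second })
    queue.add(start to 0.0)
    while (queue.isNotEmpty()) {
        val (node, d) = queue.poll()
        if (d > dist.getValue(node)) continue        // stale queue entry
        if (node == goal) break
        for ((next, w) in edges[node].orEmpty()) {
            val nd = d + w
            if (nd < (dist[next] ?: Double.MAX_VALUE)) {
                dist[next] = nd
                prev[next] = node
                queue.add(next to nd)
            }
        }
    }
    if (goal !in dist) return null                   // no route found
    return generateSequence(goal) { prev[it] }.toList().reversed()
}

fun main() {
    // Illustrative map of the covered ground-floor area.
    val map = mapOf(
        "ECSS entrance" to listOf("ECSS corridor" to 12.0),
        "ECSS corridor" to listOf("ECSN connector" to 30.0, "Restroom" to 8.0),
        "ECSN connector" to listOf("ECSN lecture hall" to 20.0)
    )
    println(shortestRoute(map, "ECSS entrance", "ECSN lecture hall"))
}
```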

2.2.3 Issue-3: Incomplete definition of how the instructions will be provided to the user while navigating.

Issue description : The original definition claimed that it would help users by giving them routes based on their location. However, it is not stated how the users will be informed of these instructions. It is not specified whether or not the numerous sensors and actuators integrated within cell phones will be utilized. Additionally, it is not yet clear how these sensors will be used.

Options :

  1. When the user confirms their destination location, a clear chime and a designated haptic pattern will be played by the phone’s vibration feedback mechanism.
  2. Following this, audio narrative instructions will be provided, giving numeric measurements of how far and in what direction the user needs to walk.
  3. On reaching the destination location the user will be notified with a different designated haptic feedback and a chime.

Decision and Rationale:

The best option will be to use both haptics and sound, as described in the options above.
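The sketch below illustrates how a route could be turned into the spoken, step-count instructions described in the options; the Step type, the assumed pace of roughly 0.7 metres per step, and the phrasing are illustrative assumptions, and the chime and vibration calls themselves are not shown.

```kotlin
// Hypothetical sketch of turning route segments into spoken, numeric instructions.
// Chime/haptic cues would come from the phone's media and vibration APIs (not shown).

import kotlin.math.roundToInt

data class Step(val direction: String, val metres: Double)

// Roughly 0.7 m per step is assumed so instructions can be narrated in steps.
fun narrate(step: Step, paceMetresPerStep: Double = 0.7): String {
    val paces = (step.metres / paceMetresPerStep).roundToInt()
    return "In $paces steps, ${step.direction}."
}

fun main() {
    val route = listOf(
        Step("turn left", 8.4),
        Step("go straight", 21.0),
        Step("your destination is on the right", 3.5)
    )
    route.forEach { println(narrate(it)) }
    // In 12 steps, turn left.
    // In 30 steps, go straight.
    // In 5 steps, your destination is on the right.
}
```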

2.2.4 Issue-4: Ambiguity in how calls will be placed in case of an emergency.

Issue description : Emergency calls and messages will be triggered possibly after detecting a fall or when the system cannot figure out the current location. The user should also be able to place an emergency call manually if they encounter an emergency situation.

Options :

I. Ring a loud alarm on the app to alert the surroundings and bring it to attention.

II. Users can also dial 911.

Decision and Rationale:

  1. In case a fall is detected, the app shall ring an alarm to notify the surroundings.
  2. The user shall be able to place an emergency call by pressing an easily accessible SOS button on the screen (a minimal flow is sketched below).
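A minimal sketch of this decision is shown below, assuming placeholder alarm and dialing calls; real telephony and audio APIs are not shown, and the default to 911 follows the option listed under issue 2.1.3.

```kotlin
// Minimal sketch of the emergency-call decision: a detected fall triggers a local
// alarm, while the on-screen SOS button dials the user's primary emergency contact
// and falls back to 911 when the contact list is empty. println stands in for the
// real alarm and dialing mechanisms.

data class EmergencyContacts(val numbers: List<String>)

fun onFallDetected() {
    println("ALARM: loud sound played to alert people nearby")
}

fun onSosButtonPressed(contacts: EmergencyContacts) {
    val number = contacts.numbers.firstOrNull() ?: "911" // default to 911 if list is empty
    println("Dialing $number")
}

fun main() {
    onFallDetected()
    onSosButtonPressed(EmergencyContacts(emptyList()))          // Dialing 911
    onSosButtonPressed(EmergencyContacts(listOf("+15550100")))  // Dialing +15550100
}
```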

2.2.5 Issue-5: Figuring out what would be the next action, based on the user’s habit.  

        

Issue description : What should be done next in this situation is unclear; should the user take the steps the app suggests while navigating, or should it predict and suggest where they should go next based on their schedule?

        

Options:

  1. Store the locations that the user visits at a particular time. Use this data to recommend the next location the user might want to visit at the same time next week.
  2. Using the stored locations, provide shortcuts to set the destination. Users can use these shortcut inputs to set their destination with minimal interaction.
  3. Depending on the destination entered by the user at a particular time, compute the optimal path and direct the user to the destination depending on their current location.
  4. Track the user's location through the GPS system built into the mobile phone. Depending on this location, direct the user on the next action that they need to take in order to reach their destination.
  5. Depending on the distance to the point where they need to take the next action (e.g. turn left, turn right, go straight), provide advance prompts on how far they need to walk before taking that action.

Decision and Rationale: The mobile app would use previous locations, chosen destinations, GPS position, and best path calculation to deliver personalized suggestions and navigation directions with prompts.
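The sketch below illustrates the habit-based suggestion under simple assumptions: visits are recorded with their weekday and hour, and the location most often visited at the current weekday and hour is offered as the next destination. The Visit type and the frequency rule are illustrative only; storage and scheduling are not shown.

```kotlin
// Hypothetical sketch of habit-based destination suggestions built from stored visits.

import java.time.DayOfWeek

data class Visit(val location: String, val day: DayOfWeek, val hour: Int)

// Suggest the location the user most often visits at this weekday and hour.
fun suggestNext(history: List<Visit>, now: DayOfWeek, hour: Int): String? =
    history.filter { it.day == now && it.hour == hour }
        .groupingBy { it.location }
        .eachCount()
        .maxByOrNull { it.value }
        ?.key

fun main() {
    val history = listOf(
        Visit("ECSS lecture hall", DayOfWeek.MONDAY, 10),
        Visit("ECSS lecture hall", DayOfWeek.MONDAY, 10), // repeated visit at the same slot
        Visit("ECSN lab", DayOfWeek.MONDAY, 14)
    )
    println(suggestNext(history, DayOfWeek.MONDAY, 10))   // ECSS lecture hall
    println(suggestNext(history, DayOfWeek.TUESDAY, 10))  // null -> no suggestion
}
```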

2.3 Issues with II.3 Software System Non-Functional Requirements 

         

2.3.1 Issue-1: Redundancy in input methods.

Issue description: Special attention must be paid to the user experience because most potential users will have visual impairments of varying degrees, and the application should incorporate features that make it easy to use.

Options :

  1. Enable voice recognition, since that is the ideal way for a visually impaired user to operate a mobile application.
  2. For people with partial vision, enable touch and vision-based input methods.
  3. Provide a user guide/manual for new users.

Decision and rationale:

A combination of all the proposed solutions will be implemented to provide the user with a good experience.

2.3.2 Issue-2: Configurability: Are the blind people able to configure the app by themselves?

Issue Description: According to the preliminary definition, a secondary stakeholder might be involved in setting up the basic configuration of the app. But what if the caretaker is not available to the blind person for the initial configuration of the app and account? What level of interaction and intervention do we expect from the caretaker?

Options:

  1. If a caretaker is available, then they can configure the app.
  2. By using the voiceover (accessibility) feature of the smartphone, blind people can configure the app by themselves.

 

Decision and Rationale: The app shall accommodate both options: as discussed, the blind person should be able to configure the app by themselves, and a caretaker should also be able to configure it for them. If the caretaker is available, configuration time will be saved.

2.3.3 Issue-3: What is “safety” and “comfortability” in the context of indoor navigation?

Issue Description: To provide safe and comfortable navigation, we have to add some way to detect and alert the blind person about obstacles such as a person, a maintenance cart, tables, chairs, etc. Hence, we have to support object detection, and for that the user needs to hold the device such that the rear camera points towards the path along which the user is walking.

Options:

  1. If the user wants object detection then they should hold their smartphone according to the guidelines.
  2. If the user opts out of the object detection or holds their phone in a different way then voice commands alone will work.

        

Decision and Rationale: By making use of the smartphone's camera, the system shall provide object detection, and the application shall alert users by voice command about objects in their path.
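The sketch below illustrates the warning side of this decision, assuming some computer-vision component (not shown) supplies an estimated distance to the nearest obstacle; the three-metre alert range and the linear scaling of volume and vibration are assumptions made for the example.

```kotlin
// Minimal sketch of obstacle warnings: warning volume and vibration intensity rise
// as the estimated distance to the obstacle shrinks. Thresholds are illustrative.

data class Warning(val volume: Double, val vibrationIntensity: Double, val message: String)

fun obstacleWarning(distanceMetres: Double, alertRangeMetres: Double = 3.0): Warning? {
    if (distanceMetres >= alertRangeMetres) return null       // nothing close enough to warn about
    val closeness = 1.0 - distanceMetres / alertRangeMetres   // 0.0 (far) .. 1.0 (touching)
    return Warning(
        volume = closeness,
        vibrationIntensity = closeness,
        message = "Obstacle about %.1f metres ahead".format(distanceMetres)
    )
}

fun main() {
    println(obstacleWarning(5.0))  // null: no warning yet
    println(obstacleWarning(2.0))  // gentle warning
    println(obstacleWarning(0.5))  // strong warning, high volume and vibration
}
```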

        

2.3.4 Issue-4: User preferences - Customizability

Issue description: A user must be able to set the configurations of the application according to their convenience.

Options:

I.  Allowing the user to customize elements like languages for both user input and the system output.

II. Having rigid configurations that assume one set of choices based on the majority of users.

 

Decision and rationale:

Going with option I and enabling the user to pick from multiple languages.

2.3.5 Issue-5: Security: User’s phone is lost or misplaced.

Issue description: If the user misplaces the phone, the application must prevent misuse of the user's information should the device be found by someone else, and must prevent anybody besides the user and the caretaker from accessing the emergency contacts.

Options:

I. The application may authenticate the user based on the biometrics at regular intervals

II. The application may use voice recognition to identify the bona fide user.

III. The application may request for the user credentials every time it is to be used.

IV. The application may authenticate  the user based on the biometrics when accessing or editing some personal or emergency contact information.

Decision and rationale:

A combination of options II and IV will be implemented. Option II is the most feasible because constantly authenticating the user via biometrics would require the user to hold the phone in their hands at all times, and asking them to type in credentials conflicts with the ease of use of the application. Voice recognition is the best solution because it allows the user to keep their hands free.
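A minimal sketch of this policy is shown below, assuming stand-in VoiceRecognizer and BiometricCheck interfaces rather than any real recognition or biometric API: ordinary commands pass a voice check, while access to personal or emergency-contact data additionally requires a biometric confirmation.

```kotlin
// Hypothetical sketch of the chosen authentication policy (options II and IV).
// The recognizer and biometric prompt are stand-in interfaces, not real APIs.

interface VoiceRecognizer { fun matchesEnrolledUser(sampleId: String): Boolean }
interface BiometricCheck { fun confirm(): Boolean }

class AuthGate(private val voice: VoiceRecognizer, private val biometric: BiometricCheck) {

    // Ordinary commands only need the speaker to match the enrolled voice profile.
    fun allowCommand(sampleId: String): Boolean = voice.matchesEnrolledUser(sampleId)

    // Sensitive data additionally needs an explicit biometric confirmation.
    fun allowPersonalDataAccess(sampleId: String): Boolean =
        allowCommand(sampleId) && biometric.confirm()
}

fun main() {
    val gate = AuthGate(
        voice = object : VoiceRecognizer { override fun matchesEnrolledUser(sampleId: String) = true },
        biometric = object : BiometricCheck { override fun confirm() = false }
    )
    println(gate.allowCommand("sample-1"))            // true: voice matched
    println(gate.allowPersonalDataAccess("sample-1")) // false: biometric check failed
}
```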

2.3.6 Issue-6: User meeting with an accident - safety

Issue description: If the app detects an accident, how can it know whether it was a genuine accident or a false alarm, and how should it handle both cases?

Options:

I. Send out an alert to the emergency contact selected by the user and/or the ambulance.

II. Ring a loud alarm on the app to alert the surroundings and bring it to attention.

III. If the app detects that the user fell down, it will give a prompt to the user to call the emergency contact.

Decision and rationale: When an accident is detected by the device, the best option is to prompt the user, default to calling the emergency contacts, and give the user the option to cancel the call within a timeout.
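A minimal sketch of this flow is shown below: after a suspected fall the user is given a cancellation window, and if they do not cancel in time the emergency contact is dialed automatically. The 15-second default window, the polling approach, and the dialing placeholder are assumptions made for illustration only.

```kotlin
// Sketch of the fall-response decision: prompt, wait for a cancellation window,
// then call the emergency contact unless the user cancelled. Timing and dialing
// are simulated with simple placeholders.

fun handleSuspectedFall(
    cancelledByUser: () -> Boolean,   // polls whether the user tapped/said "cancel"
    timeoutSeconds: Int = 15,
    dial: (String) -> Unit,
    emergencyContact: String
) {
    println("Possible fall detected. Calling $emergencyContact in $timeoutSeconds seconds unless cancelled.")
    repeat(timeoutSeconds) {
        if (cancelledByUser()) {
            println("Call cancelled by user; treated as a false alarm.")
            return
        }
        Thread.sleep(1000) // wait one second between checks
    }
    dial(emergencyContact)
}

fun main() {
    handleSuspectedFall(
        cancelledByUser = { false },        // simulate: user does not cancel
        timeoutSeconds = 3,                 // shortened for the example
        dial = { println("Dialing $it") },
        emergencyContact = "+15550100"
    )
}
```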

3. WRS

        

3.1 W

        

3.1.1 Problem 

Navigating indoors can be very challenging for those who have poor vision or none at all. This issue becomes even more noticeable when trying to move through a crowded hallway. Keeping track of obstacles and remaining on the right path can be an extremely difficult, risky, and intimidating undertaking. Additionally, relying solely on the braille markings at door entrances can make it very challenging for a blind person to find their way back.

        

3.1.2 Goal

Our aim is to develop a smartphone application that offers a very simple and user-friendly means of aiding blind and visually impaired people, using the mobile device's touch screen, button presses, auditory input, and haptic and auditory output. With this app, the indoor navigation challenges faced by blind people should be reduced.

                

3.1.3 Improved understanding of II.1 The Domain, Stakeholders, Functional and Non-Functional Objectives

  1. (DRS1) The application shall be used by people with vision disorders or impairments (henceforth referred to as blind people), regardless of the level of their impairment. (2.1.1)

  2. (DRS2) The system shall be used by the blind person when navigating through the ground floors of the connected parts of the ECSS and ECSN buildings. (2.1.2)

  3. (DRS3) The system shall be used by secondary stakeholders, which may include an assistive person, a caretaker, a family member, emergency personnel, and a person helping the primary stakeholder set up the app on their device. (2.1.3)

  4. (DRS4) The system shall provide the blind person with touch-, haptic-, and voice-based input methods to interact with the application. (2.1.4)

  5. (DRS5) The system shall provide a special input method or interface to the user for making emergency calls; these emergency services may include 911, assistive services, hospitals, or special emergency contacts. (2.1.4)

3.2 RS

                

3.2.1 Functional RS – Improved understanding of II.2 Software System Requirements: FRs

  1. (FRS1) The system shall provide the following sensory aids to the blind person when entering and confirming their intended destination:
     1. (FRS1.a) An auditory confirmation of the destination they intend to reach. (2.2.1)
     2. (FRS1.b) A haptic response on destination confirmation. (2.2.1)

  2. (FRS2) On detecting, through the phone's GPS system, that the user's location is the same as the destination location, the system shall:
     1. Record the coordinates of the starting location, the time navigation started, the time the user reached the destination, and the coordinates of the destination location. (2.2.5)
     2. Provide a unique chime and tactile sensation notifying the user that they have reached their destination. (2.2.5)

  3. (FRS3) The system shall provide instructions to the user after they select the destination:
     1. (FRS3.a) A haptic cue will be given once navigation is started. (2.2.3)
     2. (FRS3.b) Voice commands will be given to the user. (2.2.3)

  4. (FRS4) The system shall include an accessible button on the home screen of the application to initiate emergency calls. This may include calling 911 or an emergency contact, contacting assistive services, or sending a message to a nearby department such as a hospital. (2.2.4)

  5. (FRS5) When the user confirms their destination location, the system shall:
     1. Select the optimal route based on the user's current location detected by the smartphone's GPS system, then start directing the user along the selected route.
     2. Provide the user with an option, via touch or voice input, to select other calculated routes. (2.2.2)

  6. (FRS6) Upon arrival of the time of the week recorded for a routine-classified visit, the system shall:
     1. Play a unique chime notifying the user to visit the location stored in the records.
     2. Allow the user to cancel or confirm the navigation.
        1. Case 1: The user confirms the navigation. Set the destination location as indicated by the stored records and initiate the actions specified in FRS1.
        2. Case 2: The user cancels the navigation. Display the home screen with auditory and haptic feedback for the cancellation. (2.2.5)

  7. (FRS7) The system shall automatically send an SOS to the emergency contact in case of accident detection. (2.3.6)

  8. (FRS8) The system shall provide a proper authentication mechanism for verifying a genuine user when accessing personal information. (2.3.5)

  9. (FRS9) Using the camera and computer vision and object detection models running on the smartphone, if the system detects that the user is walking towards and getting closer to an obstacle in their path, the system shall:
     1. Provide an auditory warning notifying the user that they are about to collide with an obstacle. The intensity and volume of this warning will increase as the distance between the user and the obstacle decreases. (2.3.3)
     2. Provide haptic/vibratory feedback using the phone's haptic vibrator, with frequency and intensity increasing as the distance between the user and the obstacle decreases. (2.3.3)

  10. (FRS10) Following the completion of the actions specified in FRS2, if the records indicate that the user has visited the same location on the same weekday for two consecutive weeks, the system shall record it as a routine visit and add that category to the previously stored records. (2.2.5)

  11. (FRS11) After downloading, installing, and opening the app on the user's smartphone for the first time, the system shall:
     1. Allow the user to select, from a list of languages, the language in which they would like to use the application. (2.3.4)
     2. Allow the primary stakeholder to select whether they would like to configure the app using the phone's accessibility features or with the help of a secondary stakeholder. (2.1.1)
     3. Provide a form to fill in their emergency contact information. The form shall also include their preference on who should be the primary emergency contact to dial in case of an emergency. (2.3.3)

        


3.2.2 Non-functional RS - Improved understanding of II.3 Software System Non-Functional Requirements: NFRs

  1. (NFRS1) The system shall allow the blind person to intuitively enter the intended location they want to reach. (2.3.1)

  2. (NFRS2) The system shall offer ease of use by having a shallow learning curve. (2.3.2)

  3. (NFRS3) The system shall provide safety to the user by detecting and announcing obstacles in the user's path and by providing an emergency call function. (2.3.3)

  4. (NFRS4) The system shall offer redundancy by providing multiple non-vision-based modes of communication with the application.

  5. (NFRS5) The system shall offer comfort by supporting multiple languages in the app. (2.3.4)

  6. (NFRS6) The system shall provide dependability and reliability by giving distinct and proper prompts at the start of, during, and at the completion of navigation. Additionally, it shall allow secondary stakeholders to interact with the application using vision-based methods. (2.2.5)

  7. (NFRS7) The system shall offer personalization by storing the user's routine and recommending navigation destinations based on this stored data. (2.2.2)

  8. (NFRS8) The system shall offer security by providing a voice-based authentication method. (2.3.5)

4. Preliminary Prototype and User Manual

5. Traceability (both forward and backward; among W, FRS and NFRS; and also possibly between before and after)

5.1 Traceability among issues and requirement specifications

        

Requirement ID | Requirement category | Issue ID | Issue category
DRS1 | Domain Requirement | 2.1.1 | Domain issue
DRS2 | Domain Requirement | 2.1.2 | Domain issue
DRS3 | Domain Requirement | 2.1.3 | Domain issue
DRS4 | Domain Requirement | 2.1.4, 2.2.1, 2.3.1 | Domain, functional and non-functional issues
DRS5 | Domain Requirement | 2.2.4, 2.3.3, 2.3.6 | Functional and non-functional issues
FRS1 | Functional Requirement | 2.1.4, 2.2.1 | Domain and functional issues
FRS2 | Functional Requirement | 2.2.5 | Functional issue
FRS3 | Functional Requirement | 2.2.3 | Functional issue
FRS4 | Functional Requirement | 2.1.3, 2.2.4, 2.3.3, 2.3.6 | Domain, functional and non-functional issues
FRS5 | Functional Requirement | 2.2.1, 2.2.2 | Functional issues
FRS6 | Functional Requirement | 2.2.2, 2.2.5 | Functional issues
FRS7 | Functional Requirement | 2.2.4, 2.3.3, 2.3.6 | Functional and non-functional issues
FRS8 | Functional Requirement | 2.3.5 | Non-functional issue
FRS9 | Functional Requirement | 2.2.3, 2.3.3 | Functional and non-functional issues
FRS10 | Functional Requirement | 2.2.5 | Functional issue
FRS11 | Functional Requirement | 2.1.1, 2.1.3, 2.1.5, 2.3.2, 2.3.4 | Domain and non-functional issues
NFRS1 | Non-Functional Requirement | 2.1.4, 2.2.1, 2.3.1 | Domain, functional and non-functional issues
NFRS2 | Non-Functional Requirement | 2.1.5, 2.3.2 | Functional and non-functional issues
NFRS3 | Non-Functional Requirement | 2.2.3, 2.2.4, 2.3.3, 2.3.6 | Functional and non-functional issues
NFRS4 | Non-Functional Requirement | 2.1.4, 2.2.1, 2.3.1 | Domain, functional and non-functional issues
NFRS5 | Non-Functional Requirement | 2.1.5, 2.3.2, 2.3.4 | Domain and non-functional issues
NFRS6 | Non-Functional Requirement | 2.1.3, 2.2.1, 2.2.2, 2.2.3, 2.2.5, 2.3.1, 2.3.2, 2.3.3 | Domain, functional and non-functional issues
NFRS7 | Non-Functional Requirement | 2.2.2, 2.2.5 | Functional issues
NFRS8 | Non-Functional Requirement | 2.3.5 | Non-functional issue

5.2 Traceability among Functional and Non-Functional Requirement Specifications

ID

NFRS1

NFRS2

NFRS3

NFRS4

NFRS5

NFRS6

NFRS7

NFRS8

FRS1

X

X

FRS2

X

X

X

FRS3

X

X

X

FRS4

X

FRS5

X

X

FRS6

X

FRS7

X

FRS8

X

FRS9

X

FRS10

X

FRS11

X

References:

https://www.fastcompany.com/90768125/this-app-gives-people-who-are-blind-step-by-step-audio-directions-to-easily-navigate-public-transit 

https://www.orcam.com/en/myeye2/