Android-based intelligent walking stick for the visually impaired




 Android is a software stack and mobile operating system for portable devices, comprising the operating system itself, middleware, and a user interface. Developers write Android applications in the Java language, and the compiled bytecode runs on a runtime library. Android also provides the tools and APIs needed for application development through the Android Software Development Kit (SDK). Android runs on the Linux kernel, and the system includes C/C++ libraries. Unlike existing Java virtual machines, Android runs each Java application in its own virtual machine instance in a separate process. Google acquired Android in 2005, and in November 2007 announced that the Android platform would be freely open to the public. After the announcement, 48 hardware, software, and telecommunications companies came together to form the Open Handset Alliance (OHA), which has been developing an open public standard. Google distributed all Android source code under the Apache v2 license so that companies or individual users can develop for Android independently. Architecturally, the platform is divided into five layers: applications, libraries, the Android runtime, the application framework, and the Linux kernel. The handset layout is adaptable, extending a 3D graphics library based on OpenGL ES 1.0 alongside a VGA-capable 2D graphics library, and it uses the SQLite database for data storage. Android supports connectivity technologies including GSM/EDGE, CDMA, EV-DO, UMTS, Bluetooth, and Wi-Fi. It also supports an open-source web browser and application framework, and, with additional hardware support, touch screens, GPS, an acceleration sensor, a compass sensor, and 3D graphics acceleration.


GPS (Global Positioning System)

 GPS is a satellite-based radio navigation system developed by the United States Department of Defense for military navigation, though it can also be used by civilians with limited accuracy. It measures the travel time of radio signals from the satellites to a receiver and from this computes the receiver's exact 3D position, velocity, and time. The system is available worldwide, 24 hours a day, to an unlimited number of users. GPS can be divided into three segments: the Space Segment, the Control Segment, and the User Segment. The Space Segment comprises twenty-four satellites that orbit the Earth every twelve hours. As of April 2007 there were a total of 36 GPS satellites, with 30 of them active and 6 held as spares in case any satellite fails. The Control Segment is the network of ground stations that tracks and manages the GPS satellites. The User Segment comprises GPS users and their GPS receiver devices.
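The position fix described above rests on a simple relationship: the range to each satellite is the signal's travel time multiplied by the speed of light. A minimal sketch of that step (the class and method names are illustrative, not part of any GPS library):

```java
public class PseudorangeDemo {
    // Speed of light in metres per second, as used for GPS ranging.
    static final double SPEED_OF_LIGHT = 299_792_458.0;

    // Range to a satellite from the measured signal travel time (seconds).
    static double rangeFromTravelTime(double travelTimeSeconds) {
        return SPEED_OF_LIGHT * travelTimeSeconds;
    }

    public static void main(String[] args) {
        // A GPS satellite orbits at roughly 20,200 km, so the signal
        // takes on the order of 67 ms to reach the receiver.
        double range = rangeFromTravelTime(0.0674);
        System.out.println(range); // roughly 2.02e7 metres
    }
}
```

Measuring this travel time to at least four satellites lets the receiver solve for its three position coordinates plus its clock error.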

Google Maps Android API:-

Many applications that deal with the user's location, or with places in general, can be built on Google Maps; Google provides a family of APIs for this, including the Maps API, Places API, Geocoding API, Roads API, and Directions API. The Google Maps Android API includes built-in support for accessibility.

Smartphone application design

We can use Android APIs to make the device easier to use for people without sight or with reduced vision. The built-in Android option is the TalkBack screen reader, which ships with the platform. The developer makes this feature work by setting the android:contentDescription attribute on interface controls and by ensuring that all of them are reachable by the user. Android 4.0 also introduced additional accessibility API calls.
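For example, a control in a layout file can be labelled for TalkBack through the android:contentDescription attribute; the view id, drawable, and description text below are illustrative only:

```xml
<!-- Illustrative layout fragment: the id, drawable, and text are examples. -->
<ImageButton
    android:id="@+id/notify_button"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:src="@drawable/ic_notify"
    android:contentDescription="Send obstacle notification" />
```

When the user touches this control, TalkBack speaks the content description aloud instead of leaving the image button unlabelled.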


Software design


The software on the system was designed to provide notifications as an early warning for obstacles. The stick itself retains its usual role in detecting obstacles, but is augmented by the technology. The software will also provide a notification if a change in gradient is detected. The system uses an event-driven architecture. The software on the processing hardware drives the system by calling the “detect” subroutine alternately on the two obstacle-detecting sensors every cycle, and on the gradient sensor on a less frequent cycle. The controlling software then changes state based on the input received from the sensors and, if required, passes a notification (event) to the connected stick via the application. Once the message has been handled, the controller returns to the base state.

Each call to the detect subroutine reads back the distance data, and the controller analyses it to determine the distance of objects ahead. The distance data is then filtered to determine the range of distances that count as an object. If an object falls in the range that counts as an “early warning” (1 m to 2.5 m), a notification is called on the connected stick; the pattern and strength of that notification depend on the closeness of the object. The detection of ground gradients is based on the angle between the sensor’s measured distance and the ground, calculated via trigonometry. The angle is held in a buffer, and if it changes by more than an allowed tolerance, a notification is called on the connected stick via the smartphone app. Notifications take the form of a Bluetooth serial message sent over the serial connection on the Bluetooth chip, receivable by the application through the Android Bluetooth API.
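The controller logic above can be sketched in plain Java. The class and method names, the linear strength scaling, the 5° gradient tolerance, and the message format are illustrative assumptions, not the project's actual code; only the 1 m to 2.5 m early-warning band comes from the design:

```java
import java.util.Optional;

// Sketch of the event-driven controller described above.
public class StickController {
    // Early-warning band for obstacles, in metres (1 m - 2.5 m, per the design).
    static final double NEAR_LIMIT = 1.0;
    static final double FAR_LIMIT = 2.5;
    // Assumed maximum angle change (degrees) before a gradient event fires.
    static final double GRADIENT_TOLERANCE_DEG = 5.0;

    // Buffered ground angle from the previous gradient cycle.
    private double lastGroundAngleDeg = Double.NaN;

    // Map a measured distance to a notification: closer objects produce a
    // stronger pattern; outside the early-warning band, no notification.
    static Optional<String> obstacleNotification(double distanceMetres) {
        if (distanceMetres < NEAR_LIMIT || distanceMetres > FAR_LIMIT) {
            return Optional.empty();
        }
        // Scale strength linearly: 1.0 at the near limit, 0.0 at the far limit.
        double strength = (FAR_LIMIT - distanceMetres) / (FAR_LIMIT - NEAR_LIMIT);
        return Optional.of(String.format("OBSTACLE:%.2f", strength));
    }

    // One possible trigonometric model: for a sensor mounted at a fixed
    // height, cos(angle) = mountHeight / slantRange to the ground.
    static double groundAngleDeg(double mountHeightMetres, double slantRangeMetres) {
        return Math.toDegrees(Math.acos(mountHeightMetres / slantRangeMetres));
    }

    // Compare against the buffered angle; emit an event if it moved too far.
    Optional<String> gradientNotification(double angleDeg) {
        boolean changed = !Double.isNaN(lastGroundAngleDeg)
                && Math.abs(angleDeg - lastGroundAngleDeg) > GRADIENT_TOLERANCE_DEG;
        lastGroundAngleDeg = angleDeg;
        return changed ? Optional.of("GRADIENT") : Optional.empty();
    }
}
```

Each cycle, the event loop would call obstacleNotification on readings from the two forward sensors (and gradientNotification less often), writing any returned message to the Bluetooth serial link for the smartphone app to receive.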

