A Platform for Location Based Augmented Reality Applications

ADITYA INSTITUTE OF TECHNOLOGY AND MANAGEMENT

Presented by

S. Sai Sateesh,
III/IV B.Tech (E.C.E),
AITAM, Tekkali.
[email protected]

N.G.R. Reddy,
III/IV B.Tech (E.C.E),
AITAM, Tekkali.
[email protected]

Abstract: Augmented Reality (AR), enhancing a user's perception of the real world with computer generated entities, and mobile computing, allowing users to access and manipulate information anytime and independent of location, are two emerging user interface technologies that show great promise. The combination of both into a single system makes the power of computer enhanced interaction and communication in the real world accessible anytime and everywhere. This paper describes our work to build a mobile Augmented Reality system that supports true stereoscopic 3D graphics, a pen and pad interface, and direct interaction with virtual objects. The system is assembled from off-the-shelf hardware components and serves as a basic test bed for user interface experiments related to computer supported collaborative work in Augmented Reality. It also describes some related applications we are developing in the research area of location based computing.

Introduction

Wearable computing allows the user to access and manipulate information and computing resources at any location and at any time. Augmented Reality is often used as a user interface technique for wearable computing because it provides an information space which is interwoven with the real world: annotating the real world with computer generated entities is a powerful interface paradigm. Information can be accessed and interacted with hands-free, continuously and transparently, so that the user's view of the real world is not interrupted, a requirement for continuous use. This lets users interact with computers in a natural way, while the computer transparently changes its behavior based on the environment without the user's intervention.

Mobilizing such an interface by deploying wearable computers is a logical extension, as the body of related work in location aware computing shows. If these technologies are combined with position tracking, location based applications become possible. An impressive demonstrator for location aware AR using a wearable computer is Columbia's Touring Machine [3], which was used to create a campus information system and situated documentaries [4].

The mobile AR setup: While the computational power of mobile computer systems is becoming impressive, the size and weight of such systems is still not optimal. Nevertheless, our system is built solely from off-the-shelf hardware components. On the one hand this allows us to quickly upgrade old devices or add new ones and to change the configuration easily. On the other hand we do not obtain the smallest and lightest mobile computer system possible.

Hardware: The most powerful portable graphics solution currently available is a PC notebook equipped with an NVidia GeForce2Go video chip, which supports stereoscopic rendering for both a head-mounted and a hand-held display. The device has a 1 GHz processor and runs under Windows 2000. We also added a wireless LAN network adapter to enable communication with our stationary setup or a future second mobile unit. The notebook is carried by the user in a backpack.

As an output device we use an i-glasses see-through stereoscopic color HMD. The display is fixed to a helmet worn by the user. Moreover, an InterSense InterTrax2 orientation sensor and a web camera for fiducial tracking of the interaction props are mounted on the helmet. Figure 1 gives an overview of the setup.

The main user interface is a pen and pad setup using a Wacom graphics tablet and its pen. Both devices are optically tracked by the camera using markers. The 2D position of the pen (provided by the Wacom tablet) is incorporated into the processing to avoid the effort and time required to provide accurate tracking on the pad itself.
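To make this combination concrete, the following is a minimal sketch (not the system's actual code) of how an optically tracked pad pose and the tablet's 2D pen position could be merged into a 3D pen position; the function name, units and 4x4 pose convention are assumptions for illustration.

import numpy as np

def pen_position_in_world(pad_pose_world, pen_xy_on_pad):
    """Combine the optically tracked pad pose with the 2D pen position
    reported by the tablet to obtain a 3D pen position.

    pad_pose_world : 4x4 homogeneous transform of the pad in world
                     coordinates (assumed to come from marker tracking).
    pen_xy_on_pad  : (x, y) pen coordinates on the pad surface, assumed
                     to be in the same metric units as the pose.
    """
    x, y = pen_xy_on_pad
    pen_on_pad = np.array([x, y, 0.0, 1.0])   # pen lies on the pad plane (z = 0)
    return (pad_pose_world @ pen_on_pad)[:3]  # transform into world coordinates

# Example: pad 10 cm above the origin, pen 2 cm right / 3 cm up on the pad.
pad_pose = np.eye(4)
pad_pose[2, 3] = 0.10
print(pen_position_in_world(pad_pose, (0.02, 0.03)))  # -> [0.02 0.03 0.1]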

User interface management software: As our software platform we use Studierstube 2.1 [5], a user interface management system for AR based on, but not limited to, stereoscopic 3D graphics. It provides a multi-user, multi-application environment and supports a variety of display devices including stereoscopic HMDs. It also provides the means of 6DOF interaction, either with virtual objects or with 2D user interface elements registered and displayed on the pad. Applications are implemented as runtime loadable objects executing in designated volumetric containers, a kind of 3D window equivalent.

While the original Studierstube environment allowed a user to arrange multiple applications in a stationary workspace, our mobile setup with a body-stabilized display allows information to be arranged in a workspace that travels along with the user. Applications stay where they are put relative to the user and are easily accessed anytime, aided by proprioception and spatial memory. Figure 2 shows a simple painting application.

Figure 2. A user interacting with the paint application (the view of the user).

Our user interface management system is also capable of managing multiple locales, which can contain any number of graphical objects. Locales are important for multi-user or multi-display operation. For example, each mobile user will require a separate wearable workspace that defines a distinct locale (coordinate system). As one user moves about, her own locale will be unaffected, but the second user will be able to see the movement of the graphical objects contained in the first user's locale. For effective collaboration, it will in most cases be necessary to add a third, stationary locale that contains the applications that both users should work with.
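As an illustration only (the class and locale names below are hypothetical, not Studierstube's API), the locale concept can be pictured as containers of graphical objects: each mobile user owns a body-stabilized locale, and a shared stationary locale holds the applications both users work with.

class Locale:
    """A named coordinate system that owns a set of graphical objects."""
    def __init__(self, name):
        self.name = name
        self.objects = []          # graphical objects placed in this locale

    def add(self, obj):
        self.objects.append(obj)

# One body-stabilized locale per mobile user: its contents travel with the user.
alice_workspace = Locale("alice-wearable")
bob_workspace = Locale("bob-wearable")

# A third, stationary locale shared by both users for collaborative applications.
shared = Locale("lab-stationary")
shared.add("painting-application")

# Each user's view renders their own locale plus every shared one.
def visible_objects(own_locale, shared_locales):
    objs = list(own_locale.objects)
    for loc in shared_locales:
        objs.extend(loc.objects)
    return objs

print(visible_objects(alice_workspace, [shared]))  # ['painting-application']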

Tracking: Mobile AR requires significantly more complex tracking than a traditional VR or AR application. In a typical VR or AR application, tracking data passes through a series of steps: it is generated by the tracking hardware, read by device drivers, transformed to fit the requirements of the application and sent over network connections to other hosts. These tasks are handled by a library called OpenTracker [6], an open software architecture for the different tasks involved in tracking input devices and processing multimodal input data. The main concept behind OpenTracker is to break up the whole data manipulation into these individual steps and build a data flow network of the transformations. The framework's design is based on XML, taking full advantage of this technology and allowing the use of standard XML tools for development, configuration and documentation.

OpenTracker uses a computer vision tracking library called ARToolkit [7] to implement the tracking of the fiducial markers on the interaction props. It analyses the video images delivered by the web camera mounted to the helmet and establishes the position of the pen and pad relative to the user's head.
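The data flow idea can be sketched in a few lines. This is not OpenTracker code, only an illustration under assumed names: a source node stands in for a device driver, a transformation node modifies each event, and the chained result is what an application would consume.

# Minimal sketch of a tracking data-flow network: a source produces raw
# events, a transformation node modifies them, and the end of the chain
# delivers the processed event. All class names here are illustrative.

class ConstantSource:
    """Stands in for a device driver delivering raw tracking data."""
    def __init__(self, position):
        self.position = position

    def get_event(self):
        return self.position

class OffsetTransform:
    """Adds a fixed offset to a position event (e.g. sensor-to-eye offset)."""
    def __init__(self, child, offset):
        self.child, self.offset = child, offset

    def get_event(self):
        x, y, z = self.child.get_event()
        dx, dy, dz = self.offset
        return (x + dx, y + dy, z + dz)

# Chain: driver -> coordinate transformation -> application.
source = ConstantSource((1.0, 0.0, 0.0))
pipeline = OffsetTransform(source, offset=(0.0, 0.0, 1.6))  # lift to head height
print(pipeline.get_event())  # (1.0, 0.0, 1.6)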

Location based AR applications: Building on the mobile platform described above, we are currently developing a number of prototype location based Augmented Reality applications. These applications are based on the location tracking described next.

Location tracking: A similar technique is used to track the user's position within the environment. Our laboratory and neighboring rooms are rigged with larger markers along the walls. The locations of these markers are measured and incorporated in a model of the building. Together with the tracking information delivered by the fiducial tracking library, the system computes the location of users within these rooms from the detected markers.

A simple location based application is the AR library. It performs two basic tasks: firstly, it shows a user the location of a requested book in the vast bookshelves of a library; and secondly, it recognizes books when the user looks at them and again displays the correct location of the book in the library shelves. A bookshelf was fit out with fiducial markers used for tracking, so the bookshelf's position can be computed by the tracking library. Dedicated books were rigged with these markers as well, so that the system recognizes such a book when the user is looking at it. In the prototype application, the markers are attached to the wall instead of a real shelf. Figure 3 shows both modes.

Figure 3. The correct location of a detected book is displayed. A selected book is shown in the shelf.
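As a rough sketch of the location tracking described above (assumed math, not the system's implementation): if a wall marker's pose in the building model is known and the fiducial tracker reports that marker's pose relative to the helmet camera, the user's pose in the building follows by composing one transform with the inverse of the other.

import numpy as np

def user_pose_in_building(marker_in_building, marker_in_camera):
    """Both arguments are 4x4 homogeneous transforms.

    marker_in_building : surveyed pose of the wall marker in the building model.
    marker_in_camera   : pose of the same marker as reported by the fiducial
                         tracker, relative to the helmet camera.
    Returns the camera (user) pose in building coordinates.
    """
    # building_T_camera = building_T_marker @ inverse(camera_T_marker)
    return marker_in_building @ np.linalg.inv(marker_in_camera)

# Example: a marker 3 m along the corridor, seen 1 m straight ahead of the camera.
marker_in_building = np.eye(4); marker_in_building[0, 3] = 3.0
marker_in_camera = np.eye(4);  marker_in_camera[2, 3] = 1.0
print(user_pose_in_building(marker_in_building, marker_in_camera)[:3, 3])
# -> [ 3.  0. -1.]  (the camera sits 1 m from the marker along its viewing axis)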

Another typical scenario for mobile AR systems is a way finding application. The aim is to guide a user along a path to a selected destination. This is accomplished using two means: a world in miniature model of the environment showing the user's location and the direction she is looking in, and augmenting the user's view with navigation guides such as navigation arrows, highlighted doors and lines along the desired path. Thus the system can continuously display navigation information registered to the real world.

Such an application requires a model of the environment as well as a means to track the user's location within the environment. As described above, we prepared the environment to allow the system to compute this: for each room a set of markers was set up and their locations measured. The tracking can now establish the location of the user within this real environment.

In the application itself the user is presented with a world in miniature model of the environment on the tablet. The user's location and current room are highlighted. She can select a destination by clicking into the room she wants to go to. Then the system computes the shortest path to this room and highlights the rooms she needs to cross. Additionally, the doors she needs to take are augmented in the environment with the path to the destination.

Location finding will be extended to encompass a part of our building to allow the user to roam in a larger environment. The integration of both applications is straightforward because of the multi application features of the Studierstube system. This will allow the user to find her way to the library, guide her along the highlighted path and then use the library application in place.
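The shortest-path step lends itself to a short illustration. The room graph and names below are hypothetical, and the sketch uses plain breadth-first search, which yields a shortest path when every room-to-room transition counts equally; the paper does not state which algorithm the system actually uses.

from collections import deque

# Hypothetical adjacency list: which rooms are connected by a door.
rooms = {
    "lab":      ["corridor"],
    "corridor": ["lab", "office", "library"],
    "office":   ["corridor"],
    "library":  ["corridor"],
}

def shortest_path(graph, start, goal):
    """Breadth-first search; returns the list of rooms to cross."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for neighbor in graph[path[-1]]:
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(path + [neighbor])
    return None  # no route between the rooms

# Rooms to highlight on the world-in-miniature model, lab -> library.
print(shortest_path(rooms, "lab", "library"))  # ['lab', 'corridor', 'library']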

Future work: The prototype applications are not finished yet. We plan to augment a real library and test the application.

Conclusion: This paper describes our work to develop a mobile AR platform that allows location-based computing. While most related work focuses on providing information as 2D text overlays, we concentrate on 3D information that the user can interact with. First we describe the mobile setup itself, consisting of the hardware used and the software system developed. Then we describe two prototype applications we are currently developing to demonstrate the abilities of the platform.

References:

  • Azuma R.: A Survey of Augmented Reality. Presence, Vol. 6, No. 4, pp. 355-385, August 1997.

  • Starner T., Mann S., Rhodes B., Levine J., Healey J., Kirsch D., Picard R.: Augmented Reality Through Wearable Computing. Presence, Vol. 6, No. 4, pp. 386-398, August 1997.
