<!DOCTYPE HTML>
<!--
Spectral by HTML5 UP
html5up.net | @ajlkn
Free for personal and commercial use under the CCA 3.0 license (html5up.net/license)
-->
<html>
<head>
<title>Open Science | Intuitive Computing Laboratory</title>
<meta charset="utf-8" />
<meta name="viewport" content="width=device-width, initial-scale=1, user-scalable=no" />
<link rel="stylesheet" href="assets/css/main.css" />
<link rel="stylesheet" href="assets/css/icl.css" /> <!-- lab custom css -->
<noscript>
<link rel="stylesheet" href="assets/css/noscript.css" /></noscript>
</head>
<body class="is-preload">
<!-- Page Wrapper -->
<div id="page-wrapper">
<!-- Header -->
<header id="header">
<h1><a href="index.html">Intuitive Computing Laboratory</a></h1>
<nav id="nav">
<ul>
<li class="special">
<a href="#menu" class="menuToggle"><span>Menu</span></a>
<div id="menu">
<ul>
<li><a href="index.html">Home</a></li>
<li><a href="news.html">News</a></li>
<li><a href="team.html">Team</a></li>
<li><a href="research.html">Research</a></li>
<li><a href="openscience.html">Open Science</a></li>
<li><a href="publications.html">Publications</a></li>
<li><a href="join.html">Join Us</a></li>
</ul>
</div>
</li>
</ul>
</nav>
</header>
<!-- Main -->
<article id="openscience_main" class="alt">
<header>
<h2>Open Science</h2>
<p></p>
</header>
<section class="wrapper style5">
<div class="inner">
<p style="font-size: larger;">We are committed to disseminating our knowledge and research products to the scientific community and the public.
Our research implementations are available on our GitHub page (<a href="https://github.com/intuitivecomputing" id="pub">https://github.com/intuitivecomputing</a>). Also, check out our <a href="https://www.youtube.com/channel/UCrJ4qGBn6B4u1iBxUfekmqQ" id="pub">YouTube channel</a> for research talks and demo videos.
</p>
<hr />
<h2>Demoshop: An Interactive Robot Programming Tool for Authoring and Editing Task Demonstrations</h2>
<p><span class="image left"><img src="images/openscience/demoshop.png" alt="Image showcasing a user using Demoshop and its continuous path discretization, programming aids, and contextual visualization features" /></span>
<a href="https://github.com/intuitivecomputing/demoshop" id="pub">Github page</a> | <a href="https://intuitivecomputing.jhu.edu/publications/2021-ras-ajaykumar.pdf" id="pub">Related publication</a> | <a href="https://www.youtube.com/watch?v=qvTMBZkvxwM" id ="pub"> Video demo</a>
<br><br>
Just as end-user programming has made computer programming accessible to a wide variety of users and settings, end-user robot programming empowers end-users to develop custom robot behaviors. Methods such as kinesthetic teaching let users demonstrate tasks directly rather than work with traditional programming constructs. Even so, everyday people still struggle to specify effective robot programs by demonstration: robot kinematics are hard to understand, and demonstrations are typically given without situated context or assistive system feedback. <br><br>
Demoshop is an interactive robot programming tool that includes visualization tools and user-centric programming aids to help end-users develop and refine their task demonstrations more easily and effectively.
</p>
<hr />
<h2>FACT: A Full-body Ad-hoc Collaboration Testbed for Modeling Complex Teamwork</h2>
<p><span class="image left"><img src="images/openscience/FACT.png" alt="Image showcasing bunkbed structure, mobile data collection backpack, head and chest-mounted camera setup, and first-person views used within FACT" /></span>
<a href="https://github.com/intuitivecomputing/FACT" id="pub">Github page</a> | <a href="https://arxiv.org/pdf/2106.03290.pdf" id="pub">Related publication</a>
<br><br>
Robots are envisioned to work alongside humans in applications ranging from in-home assistance to collaborative manufacturing. Research on human-robot collaboration (HRC) has helped develop various aspects of social intelligence necessary for robots to participate in effective, fluid collaborations with humans. However, HRC research has focused on dyadic, structured, and minimal collaborations between humans and robots that may not fully represent the large scale and emergent nature of more complex, unstructured collaborative activities. FACT (Full-body Ad-hoc Collaboration Testbed) is an openly accessible resource for researchers to better model natural, ad-hoc human collaborative behaviors and develop robot capabilities intended for colocated emergent collaborations.
<br><br>
</p>
<hr />
<h2>Dataset: Audio-Visual Representations of Object Drops</h2>
<p><span class="image left"><img src="images/openscience/object_permanence.jpg" alt="" /></span>
<a href="https://doi.org/10.7281/T1/EP0W7Y" id="pub">Dataset</a> | Bu, Fanjun; Huang, Chien-Ming, 2020, "Dataset: Audio-visual representations of object drops", <a href="https://doi.org/10.7281/T1/EP0W7Y">https://doi.org/10.7281/T1/EP0W7Y</a>, Johns Hopkins University Data Archive, V1
<br/>
<a href="https://github.com/intuitivecomputing/Object_Permanence_through_AudioVisual_Representations" id="pub">Github page</a> | <a href="https://ieeexplore.ieee.org/document/9547333" id="pub"> Related paper</a> | <a href="https://www.youtube.com/watch?v=Rj-ZZf3r4g8&feature=youtu.be&ab_channel=IntuitiveComputingLaboratory" id ="pub"> Video demo</a>
<br><br>
As robots perform manipulation tasks and interact with objects, they may accidentally drop objects that then bounce out of their visual fields (e.g., due to an inadequate grasp of an unfamiliar object). To enable robots to recover from such errors, we draw on the concept of object permanence: objects remain in existence even when they are not being sensed (e.g., seen) directly. This dataset was created to support work on this challenge. The data was collected by having a Kinova Gen3 robot arm repeatedly pick up a wooden cube (3 cm × 3 cm × 3 cm) and release it 0.3 meters above a table surface. A 7-channel microphone array recorded the impact sound from the moment the robot gripper opened to drop the cube. A camera mounted on the robot’s wrist observed the beginning of the cube’s trajectory before the cube bounced out of view, and a ceiling camera recorded the rest. Audio recordings are saved as WAV files, and the cube’s trajectory is saved as NumPy arrays in two versions: the partial trajectory observed from the wrist camera and the complete trajectory obtained by merging the wrist-camera and ceiling-camera trajectories.
</p>
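<p>Below is a minimal Python sketch of how the released files might be read. The file names are illustrative placeholders only; refer to the Data Archive record above for the actual file layout.</p>
<pre><code># Minimal loading sketch (not part of the dataset release).
# File names below are placeholders for illustration.
import numpy as np
from scipy.io import wavfile

# Impact-sound recording (WAV); if all 7 microphone channels are stored in
# one file, the returned array would have shape (num_samples, 7).
sample_rate, audio = wavfile.read("drop_001.wav")

# Cube trajectories saved as NumPy arrays: the partial trajectory from the
# wrist camera and the complete, merged trajectory.
partial_traj = np.load("drop_001_partial.npy")
complete_traj = np.load("drop_001_complete.npy")

print(sample_rate, audio.shape, partial_traj.shape, complete_traj.shape)
</code></pre>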
<hr />
<h2>PySocialForce: A Python Implementation of the Extended Social Force Model for Pedestrian Dynamics</h2>
<p><span class="image left"><img src="images/openscience/group_crossing.gif" alt="" /></span>
<a href="https://github.com/yuxiang-gao/PySocialForce" id="pub">Github page</a> | Related publication (available soon)
<br><br>
Modeling pedestrian dynamics has a variety of valuable applications, ranging from emergency simulation and urban planning to crowd simulation in video games and movies. Pedestrian simulation also plays an important role in developing mobile robots that are capable of navigating crowded human environments in a safe, efficient, and socially appropriate manner. <br><br>
PySocialForce is a pure Python package for simulating crowd dynamics based on the extended social force model. While it can be used for general crowd simulation, it is designed with social navigation applications in mind. Our Python implementation makes it easily extensible (e.g., adding new custom “social forces”) and able to interface with modern reinforcement learning environments (e.g., OpenAI Gym).
</p>
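<p>A brief usage sketch is shown below, based on the examples in the PySocialForce repository. The exact interface and parameter formats may differ between versions, so treat this as an illustration and consult the README.</p>
<pre><code># Illustrative sketch of a PySocialForce simulation; the state and obstacle
# formats here are assumptions based on the repository's examples.
import numpy as np
import pysocialforce as psf

# Each row describes one pedestrian: position, velocity, and goal
# (px, py, vx, vy, goal_x, goal_y).
initial_state = np.array([
    [0.0, 10.0, -0.5, -0.5, 0.0, 0.0],
    [0.5, 10.0, -0.5, -0.5, 0.5, 0.0],
    [10.0, 0.3, -0.5, 0.5, 0.0, 10.0],
])

groups = [[0, 1]]            # pedestrians 0 and 1 walk together as a group
obstacles = [[1, 2, 7, 8]]   # one line obstacle; coordinates follow the repository's examples

sim = psf.Simulator(initial_state, groups=groups, obstacles=obstacles)
sim.step(50)                 # advance the simulation by 50 time steps
</code></pre>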
<hr />
<h2>PATI: A Projection-Based Augmented Table-Top Interface for Robot Programming</h2>
<p><span class="image left"><img src="images/research/iui19-pati.png" alt="" /></span>
<a href="https://github.com/yuxiang-gao/PATI" id="pub">Github page</a> | <a href="http://www.cs.jhu.edu/~cmhuang/documents/publications/gao2019pati.pdf" id="pub">Related publication</a>
<br><br>
As robots begin to provide daily assistance to individuals in human environments, their end-users, who do not necessarily have substantial technical training or backgrounds in robotics or programming, will ultimately need to program and “re-task” their robots to perform a variety of custom tasks. <br><br>
PATI allows users to use simple, common gestures (e.g., pinch gestures) and tools (e.g., shape tools) to specify table-top manipulation tasks (e.g., pick-and-place) for a robot manipulator. It further enables users to interact with the environment directly when providing task specifications; for example, users can utilize gestures and tools to annotate the environment with task-relevant information, such as specifying target landmarks and selecting objects of interest.
</p>
</div>
</section>
</article>
<!-- Footer -->
<footer id="footer">
<ul class="icons">
<li><a href="https://twitter.com/chienming_huang" class="icon fa-twitter"><span class="label">Twitter</span></a></li>
<li><a href="mailto:[email protected]" class="icon fa-envelope-o"><span class="label">Email</span></a></li>
</ul>
<ul class="copyright">
<li>© Intuitive Computing Laboratory</li>
<li>Design: <a href="http://html5up.net">HTML5 UP</a></li>
</ul>
</footer>
</div>
<!-- Scripts -->
<script src="assets/js/jquery.min.js"></script>
<script src="assets/js/jquery.scrollex.min.js"></script>
<script src="assets/js/jquery.scrolly.min.js"></script>
<script src="assets/js/browser.min.js"></script>
<script src="assets/js/breakpoints.min.js"></script>
<script src="assets/js/util.js"></script>
<script src="assets/js/main.js"></script>
</body>
</html>