Categories
Knowledge Sharing, News, Partnership

CLOUD AFFECTS | WITH PHILIP SAMARTZIS & ROLAND SNOOKS


Cloud Affects, in situ, Shenzhen Biennale. Photo: RMIT University

Cloud Affects is a large-scale architectural installation by Associate Professor Roland Snooks, Chief Investigator, Design Robotics, and Associate Professor Philip Samartzis, sound artist. Crafted using algorithmic generative design and robot-assisted additive manufacturing, this work explores the impact of cloud computing. Often thought of as immaterial and benign, the cloud is, in fact, a vast ecosystem of over 40 billion devices, powered by a network of energy-hungry data centres, which will consume as much as twenty percent of the earth’s energy generation by 2025. This novel research outcome operated as an agent for meaningful public engagement, as well as an exemplar of the structural potential of 3D printed assemblages.

Robotic-assisted 3D Assemblage, Urban Art Projects, Brisbane
agentBody Algorithms & Topological Complexity

Snooks and Laura Harper, of Roland Snooks Studio, explain in their paper, “Printed Assemblages: A Co-Evolution of Composite Tectonics and Additive Manufacturing Techniques” (FABRICATE 2020), how Cloud Affects was designed using an agentBody algorithm. This behavioural formation process combined form, structure and ornament into topologically complex lattices and surfaces. These architectural behaviours establish local relationships between material elements. Such interaction is driven by direct criteria, like structural or programmatic requirements, or by more esoteric concerns relating to the generation of form or pattern.
Snooks and Harper explain the evolution of this process:
“This methodology, which has been in development since 2002, draws on the logic of swarm intelligence and operates through multi-agent algorithms (Snooks, 2020). Swarm intelligence describes the collective behaviour of decentralised systems, in which the non-linear interaction of its constituent parts self-organise to generate emergent behaviour (Bonabeau et al., 1999). Repositioning this logic as an architectural design process involves encoding architectural design intention within computational agents. It is the interaction of these agents that leads to a self-organisation of design intention and the generation of emergent architectural forms and organisational patterns.” (2020, p.204).
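To make this logic concrete, the following is a minimal Python sketch of boids-style multi-agent behaviour of the kind swarm intelligence describes: each agent steers using only local rules, and coherent global pattern emerges from repeated interaction. The parameters and steering rules here are illustrative assumptions, not the agentBody implementation described above, which encodes architectural design intention rather than simple flocking.

```python
# Minimal boids-style sketch of swarm self-organisation (illustrative only,
# NOT the agentBody algorithm): each agent steers using three local rules,
# and coherent global patterns emerge from repeated local interaction.
import math
import random

COHESION, ALIGNMENT, SEPARATION = 0.01, 0.05, 0.1
NEIGHBOUR_RADIUS = 5.0
MAX_SPEED = 2.0

class Agent:
    def __init__(self):
        self.x, self.y = random.uniform(0, 100), random.uniform(0, 100)
        heading = random.uniform(0, 2 * math.pi)
        self.vx, self.vy = math.cos(heading), math.sin(heading)

    def neighbours(self, agents):
        return [a for a in agents if a is not self
                and math.hypot(a.x - self.x, a.y - self.y) < NEIGHBOUR_RADIUS]

    def step(self, agents):
        near = self.neighbours(agents)
        if near:
            # Cohesion: steer towards the local centre of mass.
            cx = sum(a.x for a in near) / len(near)
            cy = sum(a.y for a in near) / len(near)
            self.vx += (cx - self.x) * COHESION
            self.vy += (cy - self.y) * COHESION
            # Alignment: match the average heading of neighbours.
            self.vx += (sum(a.vx for a in near) / len(near) - self.vx) * ALIGNMENT
            self.vy += (sum(a.vy for a in near) / len(near) - self.vy) * ALIGNMENT
            # Separation: move away from neighbours that are too close.
            for a in near:
                d = math.hypot(a.x - self.x, a.y - self.y) or 1e-6
                if d < 1.0:
                    self.vx += (self.x - a.x) / d * SEPARATION
                    self.vy += (self.y - a.y) / d * SEPARATION
        speed = math.hypot(self.vx, self.vy)
        if speed > MAX_SPEED:  # cap speed so the system stays stable
            self.vx, self.vy = self.vx / speed * MAX_SPEED, self.vy / speed * MAX_SPEED
        self.x += self.vx
        self.y += self.vy

agents = [Agent() for _ in range(50)]
for _ in range(200):  # iterate the local rules; global organisation emerges
    for agent in agents:
        agent.step(agents)
```

In an architectural version of such a process, the steering rules would instead encode structural, programmatic or ornamental intentions, and the emergent agent trajectories would be drawn into the lattices and surfaces described above.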

Installing Cloud Affects. Photo: RMIT University
Advanced Manufacturing Cloud Affects

Snooks and his team manifested their emergent form using carbon fibre and large-scale robot-assisted 3D printing. Essentially, the internal lattice became a structural skeleton, containing a series of hollow formworks, enclosed in a second translucent skin. In addition, the inner and outer geometries were periodically laminated to ensure structural rigidity. Each joint was resolved by casting laser-cut steel plates into the carbon fibre. The use of this technology increased quality, reduced risk, and resulted in more efficient workflows.
Cloud Affects demonstrates that structure need not be subservient to the geometry of the skin (such as taping to inflatable or printed surfaces) or converge to physically efficient forms (such as minimal surfaces); instead, structure and skin negotiate a nuanced interrelationship with the capacity to generate complex and intricate form. Given the limitations of the printing bed, the final work was designed as a series of prefabricated components with the capacity to be disassembled. Snooks discusses this process in detail in Inside the Learning Factory: Architectural Robotics.
The final outcome draws complex data, design and manufacturing processes into focus, questioning how viewers might feel about the most sophisticated technologies – software, AI, and algorithms – all powered by polluting carbon-based systems that contribute to climate change. In contrast, the 3D printing process resulted in a form of digital craft akin to coiling in pottery or basketry, creating a tactile surface capable of refracting light and drawing viewers to the piece. This juxtaposition between tangible and intangible materials, technology and making, old and new processes, creates a powerful pause for thought.

Cloud Affects Assembly in process. Photo: RMIT University
Design Robotics & Futuremaking

This project attempted to reify a structure from the nebulous via a process of futuremaking: to materialise and express intangible algorithms and make real the energy required to prop up the virtual cloud. In manifesting the tangible, it sought to offer a new architectural geometric expression, one that can only emerge from the use of advanced computation within both the design and robotic fabrication processes.
Future cities will increasingly rely on advanced cloud computing, from simple algorithmic procedures to artificial intelligence, for their design, construction and infrastructural logistics. These cloud-based algorithms become the unseen structural framework behind the evolution of urbanism and architecture. Using technology to assess impact and evolve material outcomes inevitably evokes conversations beyond the realms of art, architecture and design.



This article is adapted from:
Samartzis, Philip. “Cloud Affects.” Bogong Sound, Bogong Centre for Sound, 30 March 2020, http://bogongsound.com.au/projects/cloud-affects. Accessed 20 Oct. 2020.
Snooks, Roland, and Laura Harper. “Printed Assemblages: A Co-Evolution of Composite Tectonics and Additive Manufacturing Techniques.” FABRICATE 2020: Making Resilient Architecture, edited by Jane Burry et al., UCL Press, London, 2020, pp. 202–209. JSTOR, www.jstor.org/stable/j.ctv13xpsvw.31. Accessed 19 Oct. 2020.

Categories
Knowledge Sharing, Learn, News

A LIFETIME OF SUMMERS | WITH NIKE SAVVAS & UAP

A Lifetime of Endless Summers from below

There is a dusting of jolly confetti falling gracefully from the ceiling of The Exchange, Sydney, the spiralling, light-filled hive commissioned by Lendlease Australia and designed by Kengo Kuma & Associates. A Lifetime of Endless Summers, by renowned artist Nike Savvas, cascades in shades of yellow, orange, pink, green, and blue, capturing the wind and coaxing the harbour breeze indoors. To deliver this piece in collaboration with Savvas, Urban Art Projects (UAP) experimented with interaction design using Augmented Reality (AR) and Virtual Reality (VR) technology.

The view from inside the HoloLens
Interaction Design (Wind)

The freedom to explore and experiment consistently drove this project forward, into new and unexpected territory, not least because this was a complex and varied piece. The artwork spans a 12-metre diameter and comprises 9,200 aluminium tabs coated in a range of fluorescent paint finishes. Each component was suspended via a system of 715 ultra-fine wire cables fixed directly into the ceiling.
Once Savvas and Lendlease reached a consensus regarding the immersive experience, wind testing was carried out at UAP’s Brisbane foundry. In fabrication, the team determined the precise spacing requirements. This involved regulating clear gaps to prevent individual wire drops from becoming knotted and twisted. This kind of optimised precision enabled each wire drop to gently oscillate, delivering a range of sensations via an interplay between gentle breezes and the kinetic field of colour.
In production, the aluminium components were carefully designed and mounted to sway at random angles within an approximate range of 0-45 degrees. Each wire was placed with a minimum midpoint spacing of 300 millimetres, with a 600-gram weight appended at the end to ensure just the right amount of gravity and sway.

AR & VR Solutions

The piece was successfully delivered using HoloLens AR headsets and Fologram mixed-reality software to manage the complexities of the installation on-site, a process that encapsulates Savvas’ sense of playful ingenuity and UAP’s commitment to delivering cutting-edge solutions built on a combination of value-added processes and technological innovation.
UAP also employed these tried and tested AR and VR technologies during the documentation and installation stage. This allowed the installation team to move freely, whilst skillfully navigating and visualizing each focal point via a direct overlay of digital elements on the existing physical environment.
Using the HoloLens and Fologram allowed UAP’s craft makers to execute the exact placement of the drill holes. The same holes were then carefully matched with the suspended wire drops and ceiling trays, which sat above a circular ceiling between the market hall and mezzanine restaurant. All those involved in the process remain extremely positive and enthusiastic about their experience and its impact on the outcome. Seamlessly combining AR and VR in construction not only made for a safer work environment but also saved days of time, opening up opportunities to integrate human creativity and intuition into the process.
Advanced manufacturing systems and technologies helped reduce the occurrence of human errors, which reduced the risks and costs traditionally involved in bespoke design and construction. As such, the use of Fologram and HoloLens delivered continuous engagement, and the opportunity to expand the scope of vision systems in design-led manufacturing.

Detail, confetti components
Delivering Bespoke Outcomes

As in many industries, technological advances and human artistry in manufacturing and design are converging. Whilst some fear that automation will kill jobs, Design Robotics and UAP recognise the important role technological advances play in supporting skilled workers. Human/robot interaction not only assists in the completion of tedious and repetitive tasks but also reduces risk. In this context, human partners are free to explore creative tasks, which has a direct impact on productivity and wellbeing.
With the support of the Innovative Manufacturing Cooperative Research Centre (IMCRC), Design Robotics and UAP have partnered to present a range of new possibilities. The goal is simple – to design for human intelligence and optimize the relationship between people and machines. Watch this space, as Design Robotics and UAP are committed to operating at the forefront of novel solutions, meshing technology with human creativity to explore a myriad of new possibilities.
A Lifetime of Summers launches a long-term commitment to robotic vision systems and software user-interfaces that enhance and support skilled workers. Associate Professor Glenda Caldwell, Chief Investigator, Design Robotics, described the process as “…the opportunity to work collaboratively with robotic technologies to decrease human risk in manufacturing and increase innovation and creativity”.
Reimagining the design process and pushing boundaries in industrial robotic capabilities empowers people to navigate increasing workplace complexity. At its heart, this work identifies what robots and machines do best – problem-solving, and matches it with what humans do best – social intelligence and contextual understanding. This symbiosis creates resilient outcomes, and enhanced processes, firmly placing Australia at the forefront of innovation and enterprise.
https://www.facebook.com/uapco/videos/2906429592742845/

Entering the artwork
The Concept of Freedom

Thanks to collaborative partnerships like that between Design Robotics and UAP, embracing technology enables value-added mass customization while addressing logistical complexities, solving engineering challenges, and meeting tight deadlines. In this context, artists like Savvas can focus their attention on creative potential. This not only informs the work of the Design Robotics team but also fosters a culture of cross-pollination and skills acquisition, which benefits UAP’s craft makers and the manufacturing sector across Australia and internationally.
On one hand, A Lifetime of Summers is playful, teasing the vibrant kinesis between form, wind, and colour. On the other, it is profound in its pursuit of meaning. By simply standing beneath it, viewers are transported into a hypnotic trance, revelling and reflecting whilst charmed by a sense of freedom and the optimism of endless summers. Yet few will appreciate the cutting-edge approaches applied in its making – that’s our little secret.

Categories
Learn

REVIEW | ROBOT CALIBRATION

UR10 Robot with burnishing tool

Robotic arms are a series of joints linked together in a kinematic chain

Calibration is critical in the field of robotics, as it allows a robot’s accuracy to be enhanced through software rather than by changing mechanical components. This article covers some basic concepts involved in end-effector calibration, otherwise known as adding a new tool centre point (TCP) on a robotic arm. Most industrial manipulators have six degrees of freedom, or six joints (see Figures 1 & 2). The starting link always begins where the robot is physically mounted. These joints can be described as having a parent-child relationship. The hierarchy of these joints is important, as the child joint is always defined in reference to, and therefore dependent on, the parent joint. The last link in the kinematic chain is typically referred to as the end effector, which has a tool centre point (TCP). It is this TCP that the user manipulates in 3D space when using Cartesian control. To make robotic arms useful, various end effectors (e.g. grippers, 3D sensors, rotating tools) can be attached in order to complete different operations. As a result, defining a new TCP is necessary to make use of the mounted tools.
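The parent-child idea can be illustrated with a short, hedged Python sketch: a planar (2D) forward-kinematics function in which each child frame is defined relative to its parent, and the pose of the last link (where the flange, and eventually a TCP, sits) is the composition of every joint transform from the base outwards. The link lengths and joint angles are illustrative only and do not correspond to any particular manipulator.

```python
# Planar (2D) sketch of a kinematic chain: each child frame is defined
# relative to its parent, so the pose of the last link is the composition
# of every joint transform from the base outwards. Link lengths and joint
# angles below are illustrative only.
import math

def forward_kinematics(link_lengths, joint_angles):
    """Return (x, y, heading) of the tip of the last link (the flange)."""
    x = y = heading = 0.0                  # base frame: where the robot is mounted
    for length, angle in zip(link_lengths, joint_angles):
        heading += angle                   # child joint rotates relative to its parent
        x += length * math.cos(heading)    # then translate along the rotated link
        y += length * math.sin(heading)
    return x, y, heading

# Three links of 0.4 m, 0.3 m and 0.1 m with example joint angles in radians:
print(forward_kinematics([0.4, 0.3, 0.1], [math.pi / 4, -math.pi / 6, 0.2]))
```

A real six-axis arm works in 3D with rotations about arbitrary axes, but the chaining principle is identical.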

Figure 1: Simple drawing of a robotic arm and joints

Figure 2: Example of joints on an industrial robotic arm

There are several ways to add a new TCP on a robotic arm, and most robotic arm manufacturers will provide their own methodology. Ultimately, all these methods involve measuring the pose of your new end effector in 3D space with respect to the last joint of the manipulator. The key feature to make note of when adding a new TCP is the parent joint’s coordinate frame (see Figure 3).
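In code, this amounts to composing two poses. The sketch below is a hedged illustration using plain NumPy homogeneous transforms, not any manufacturer’s controller API, and the offset values are made up for demonstration: the new TCP is expressed as a fixed pose in the flange frame and multiplied with the flange pose to obtain the TCP pose in the base frame.

```python
# Hedged sketch: a new TCP is a fixed pose expressed in the flange (parent)
# frame. Composing the flange pose in the base frame with that offset gives
# the TCP pose in the base frame. All numbers are illustrative only.
import numpy as np

def pose(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and an xyz translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Flange pose in the base frame (here: rotated 90 degrees about Z, 0.8 m up).
Rz90 = np.array([[0.0, -1.0, 0.0],
                 [1.0,  0.0, 0.0],
                 [0.0,  0.0, 1.0]])
base_T_flange = pose(Rz90, [0.3, 0.0, 0.8])

# Tool offset measured in the flange frame: e.g. a tool tip 120 mm along flange Z.
flange_T_tcp = pose(np.eye(3), [0.0, 0.0, 0.12])

# The TCP pose in the base frame is the composition of the two transforms.
base_T_tcp = base_T_flange @ flange_T_tcp
print(base_T_tcp[:3, 3])   # TCP position in base coordinates
```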
Figure 3: Example of TCP point being defined from the last flange joint on a KUKA

To define a new TCP, both the position and the orientation are required to make up the pose. The position can be obtained by physically measuring it. It is important to know the coordinate frame, as this determines whether values are positive or negative and which axis to measure along. Depending on the complexity of the end effector, it can be quite difficult to measure the TCP. If there is an accurate 3D model, the position information can be gathered from it, but ultimately the accuracy of robotic arm control depends on how closely the model represents the tool in real life.
The orientation is crucial for Cartesian control. The controller is given target poses and is ultimately trying to match the robot’s end-effector coordinate frame to the target’s. If the orientation of the TCP is ill-defined, it can cause large sweeping motions. There are four key things to remember when defining a new orientation:

  1. Orientations simply define a new XYZ coordinate frame (see Figure 4A)
  2. All XYZ coordinate frames have to abide by the right-hand rule to be valid (see Figure 4B – this rule defines the order in which the XYZ axes can exist: the thumb is X, the pointer finger is Y and the middle finger is Z) and follow conventions that determine positive rotation (Figure 4C)
  3. The order of rotations also affects the resulting coordinate frame (see the sketch after this list).
  4. All orientations, while they may be named or ordered differently, will be defined using either Euler angles (x, y, z) or quaternions (x, y, z, w), with the units being either degrees or radians.
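Point 3 is easy to demonstrate numerically. The short NumPy sketch below is an illustrative example with arbitrary 90-degree angles, not tied to any particular robot controller: it builds right-handed rotation matrices and shows that applying the same rotations in a different order produces a different frame.

```python
# Sketch showing that the order of rotations changes the resulting frame.
# Rotation matrices follow the right-hand rule; angles are in degrees.
import numpy as np

def rot_x(deg):
    a = np.radians(deg)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0, np.cos(a), -np.sin(a)],
                     [0.0, np.sin(a),  np.cos(a)]])

def rot_z(deg):
    a = np.radians(deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0, 0.0, 1.0]])

# Rotating 90 degrees about Z and then about X is not the same as the reverse.
# With column vectors, the matrix on the right is applied first.
z_then_x = rot_x(90) @ rot_z(90)
x_then_z = rot_z(90) @ rot_x(90)
print(np.allclose(z_then_x, x_then_z))   # False: rotation order matters
```

This is why conventions such as KUKA’s A-B-C angles (rotations about Z, Y, then X, as noted in Figure 4B) must be applied in their stated order.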

Figure 4A: (A) A 3D coordinate frame in cartesian space. (B) The right-hand rule all frames will abide by. (C) The thumb represents the axis, and the curled fingers represent convention for positive rotation.
Figure 4B: Example of orientations defined in KUKA manipulators. ABC angles represent ZYX coordinate frames (note, reversed and named differently from conventional frames).

Figure 5: Coordinate frame showing the separated axis rotations

This article touched on the basic concepts involved in calibrating a new end effector on any kind of robotic arm. It is important to understand the theory that underpins how robotic arms are structured, but the best resource for understanding your particular robotic arm will be the manufacturer’s manual.

The Future of Manufacturing

With support from the Innovative Manufacturing Cooperative Research Centre (IMCRC), Design Robotics is collaborating to present a range of new fabrication and vision systems solutions. The goal is simple – to design for human intelligence and optimize the relationship between people and machines.
Pushing the limits of industrial robotics is a move to empower people. Navigating the increasing complexity of manufacturing inevitably supports human experience and enhances skills acquisition. At its heart, this approach celebrates the best of what robots and machines can achieve – problem-solving, and the best of what humans can do – social intelligence and contextual understanding.

Categories
Learn

REVIEW | OPEN INNOVATION

Open Innovation (OI) describes distributed and collaborative practices that amplify innovation. At its core, the practice is about opening up organisational boundaries to allow the exchange of knowledge with others.
An Open Innovation approach with Design Robotics: Knowledge sharing and product development

Design Robotics Open Innovation Pathway

There are two important kinds of open innovation: inbound and outbound. Inbound innovation is the process of incorporating external knowledge, while outbound innovation describes the process of taking internal knowledge to the wider marketplace for others to develop and enhance.
OI approaches bring together diverse partners such as research institutes, industry and government to increase the speed and reduce the risks associated with innovating, particularly for smaller organisations. By providing SMEs with access to advanced technologies and expertise, such collaborative arrangements facilitate the development of products and technologies by organisations that often lack the resources to manage such development on their own.
These collaborative arrangements have many labels – hubs, networks, clusters, accelerators, and incubators, for example. They can emerge organically, or be purposefully created, with national, regional, sectoral, or technological agendas. Silicon Valley is an iconic example of an innovation cluster. Ecosystems are another type of collaborative arrangement usually set up to encourage a large number of diverse organisations to think beyond their traditional supply chains. These large networks are often coordinated by a central platform leader or hub firm that, among other responsibilities, is tasked with managing direction and opportunities, and overseeing agreements and aspects of IP governance.
Opening up to outbound use of IP is one of the main challenges for organisations in adopting OI practices. Those most successful have managed opportunities in both inbound and outbound innovation – diversifying their product pipeline while developing absorptive capacity to leverage the resources of others within the network. Opening up knowledge accelerates research and development processes. It provides ways for organisations to address new and emerging challenges and creates opportunities for innovation not previously considered possible.
 

Categories
News

Smartgeometry Workshop and Conference

Research Fellow Dr Muge Belek Fialho Teixeira was selected to participate in a workshop at Smartgeometry, hosted by the University of Toronto earlier this year. In this post, Muge reflects on the workshop and conference.

Smartgeometry was founded in 2001 and is now a biennial event. It starts with four days of themed workshops followed by a two-day conference. Smartgeometry (SG) workshops and conferences have been influential across many disciplines, including architecture, design, engineering and mathematics. Originating as a collaboration between industry, researchers and academics, SG has always been a platform where innovative ideas become reality, informing the potential needs of these disciplines towards a better future.
The workshops are called clusters and are organised around open calls coordinated by ‘cluster champions.’ Cluster champions are collaborative teams from academia and practice who come together to prepare a proposal, or a response, to a specific theme. SG’s open call encourages researchers, academics and industry to discuss possible research questions around the proposed theme and to pursue a research avenue via a project. By working on this project, researchers and practitioners from industry and universities have a chance to see how these technologies can be applied. Participants for each of the clusters applied for a position via open calls, with cluster briefs defined by the cluster champions. Participants were selected from a competitive, international pool of applicants, based on their background, research expertise and current interests.

The conference, which took place after the workshops, furthered discussions around the workshop themes, informed by the perspectives of multidisciplinary invited keynote speakers. The conference was curated in a way that would feed back into the outcomes from the workshops. In that manner, it was a dynamic conference, where the keynote speakers built on the work produced by the clusters and opened up new agendas for future speculation. The keynote talks were followed by Q&A sessions that allowed the workshop participants to engage openly with the speakers. These exchanges also provided opportunities for future collaborations.
The University of Toronto hosted Smartgeometry under the theme “Machine Minds”, which revolved around machine learning and AI (Artificial Intelligence). Current discussions on machine learning and AI generally consist of depressing scenarios of humans coming to an end or losing their jobs. Websites like “Will robots take my job?” are opening up discussions about whether we will have to give up the professions we are passionate about. As a trending topic for many disciplines, SG focused on how machine learning and AI can be utilised for design, and on other positive and constructive ways of approaching the topic. The clusters explored areas where machine learning and AI can be applied, whereas the keynote speakers of the conference tried to create an understanding of what machine learning and AI are and their impact on our society, as well as the methods they use in their own practice.
The clusters at SG were:
– Smart materials (Fibrous timber joints, Materials as probes)
– Smart geometries (AI strategies for space frame design, Mind ex-machine)
– Smart fabrication methodologies (Soft Office)
– Smart and innovative ways of perceiving the environment (Behavioural Enviro[NN]ments, Data Mining the City, Fresh Eyes, Inside the Black Box, Sound and Signal)
All of them used cutting-edge technologies and customized software to define geometries. These technologies included interactive tables, VR headsets, industrial robots, mobile robots, CNC routers, sensors, microphones, and many more. The most dominant software platform used by the clusters was Rhino with the Grasshopper plug-in, as a unifying platform, but other software was also used, such as Unity, Processing, Arduino, Python, and custom-built software developed for the clusters. More information on each of the clusters can be found here.
Highlights from conference discussions were:
– what AI and machine learning are,
– how AI and machine learning will affect the future of societies and how we can prepare,
– collecting, interpreting and managing data,
– natural intelligence versus digital intelligence,
– machine learning versus human learning,
– robotics and advanced manufacturing,
– interactive installations,
– complex geometries.
The schedule and the keynote speakers can be found here.
As part of SG2018, there was also a trip to see Autodesk Toronto’s new workplace. Autodesk has been a close collaborator of SG, acting as a sponsor and providing know-how, keynote speakers, cluster champions and event participants. The new Autodesk workplace has been designed using generative algorithms and has a research centre for exploring new technologies. One of the clusters (Mind ex Machina) took place in this research centre, using two UR10 collaborative robotic arms with custom-built open-source software developed for SG18. It seems Autodesk has started to take a pioneering role in research by collaborating with research institutions, researchers and companies through these research centres. With artist-in-residency programs, they are opening up their facilities globally to makers and curious minds. A list of Autodesk research centres can be found here.
Looking to the future, the next Smartgeometry will take place at Carnegie Mellon University in Pittsburgh, USA, in 2020, with another challenging theme!