VIP Invitation to Autodesk Forge Happy Hour
Get the party started at the #ForgeDevCon Official Pre-Party!
We wanted to extend an invitation to the #ForgeDevCon Official Pre-Party on Tuesday, June 14, from 6:00–9:00 PM at Hawthorn (46 Geary St., SF). This event is a joint affair with Autodesk to kick off the Autodesk Forge Developer's Conference.
Forge DevCon, if you don't know already, is the first-ever developer's conference for Autodesk and its partners. The event will unveil Autodesk's new free suite of web development tools, along with solutions that companies have built with them. It is like getting open-sourced tech from Autodesk.
Here is a sneak peek at what you can see at Autodesk Forge DevCon:
- Chris Anderson, CEO of 3D Robotics (3DR) and former Wired Magazine editor, will be keynoting. His team at 3DR is using the Forge API to develop a reality capture solution that allows field professionals to quickly and easily perform inspections, surveys, and scans of work sites from unmanned aerial vehicles (UAVs). For example, consider telecom inspections that involve climbing tall towers: 3DR's SOLO drone can capture the data and process it in the cloud using the Forge API, eliminating the need for personnel to risk injury.
- Explore the 3D Webfest, which brings live visual artists on stage to create, manipulate, and delight audiences with extraordinary three-dimensional designs. Your special ticket price includes this event, which will be held on June 15 at Fort Mason.
- 50+ workshops and classes on product design, manufacturing, 3D, VR/AR, IoT and AEC APIs and SDKs to help you create solutions with design and engineering data.
See you at Forge DevCon!
Via Community Evangelist, Autodesk Forge
The European Organization for Nuclear Research (CERN) has made 300 TB of data collected by its Large Hadron Collider (LHC) available to the public. The LHC is the world's largest particle collider, and the largest machine ever built. It is used to test theories of particle physics as well as the properties of the recently discovered Higgs boson.
“Primary” and “derived” datasets are available for download on the Compact Muon Solenoid (CMS) Open Data website. The primary datasets are in the same format used by the organization, while the derived datasets are formatted to be easier for the public to use. The Open Data website also provides tools to analyze the data.
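A classic first analysis with CMS open data is reconstructing a dimuon invariant-mass spectrum. The sketch below shows only the kinematics of that calculation; the event format, file I/O, and the example four-vectors are illustrative assumptions, not the portal's actual schema.

```python
import math

def invariant_mass(p1, p2):
    """Invariant mass of a two-particle system.

    Each particle is a four-vector (E, px, py, pz) in GeV;
    m^2 = (E1 + E2)^2 - |p1 + p2|^2 in natural units.
    """
    E = p1[0] + p2[0]
    px = p1[1] + p2[1]
    py = p1[2] + p2[2]
    pz = p1[3] + p2[3]
    m2 = E * E - (px * px + py * py + pz * pz)
    return math.sqrt(max(m2, 0.0))  # clamp tiny negatives from rounding

# Two back-to-back 45.6 GeV muons (muon mass negligible): the pair
# reconstructs to roughly the Z boson mass.
mu1 = (45.6, 0.0, 0.0, 45.6)
mu2 = (45.6, 0.0, 0.0, -45.6)
print(round(invariant_mass(mu1, mu2), 1))  # 91.2
```

Histogramming this quantity over many events is how peaks such as the J/ψ or Z appear in the derived datasets.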
The rationale behind the data release is explained in CERN's news release:
These data are being made public in accordance with CMS’s commitment to long-term data preservation and as part of the collaboration’s open-data policy. “Members of the CMS Collaboration put in lots of effort and thousands of person-hours each of service work in order to operate the CMS detector and collect these research data for our analysis,” explains Kati Lassila-Perini, a CMS physicist who leads these data-preservation efforts. “However, once we’ve exhausted our exploration of the data, we see no reason not to make them available publicly. The benefits are numerous, from inspiring high-school students to the training of the particle physicists of tomorrow. And personally, as CMS’s data-preservation co-ordinator, this is a crucial part of ensuring the long-term availability of our research data.”
Read CERN’s news release on its website.
Image credits: CERN
Facebook announced a new video capture system named Facebook Surround 360 on April 12. The hardware and software video solution provides a professional, end-to-end system that automates the normally complex process of producing 360° panoramic video for 3D Virtual Reality (VR) use.
Like other 360° systems, the Surround 360 consists of an array of video cameras pointing in every possible direction. However, many basic 360° camera systems are “monoscopic” and appear flat because they lack depth. The Surround 360 is a “stereoscopic” system that uses two cameras to simulate the left and right eyes when capturing each location in a scene. This gives the user increased depth perception, allowing for a believable 3D experience.
Brian Cabral, Director of Engineering at Facebook, cited the stitching software as one of the most critical parts of the system. Stitching refers to the process of merging footage from the 10–20 cameras pointing in different directions into a complete picture that appears continuous and as undistorted as possible when viewed through VR hardware. Typically, this stitching is done by hand and is labor intensive. The Surround 360 performs it automatically through a series of imaging algorithms.
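Facebook has not published its stitching algorithms in this announcement, but the core idea of hiding a seam between overlapping camera views can be illustrated with a toy feathering blend. Everything below (the function name, the linear alpha ramp, the grayscale strips) is an illustrative assumption, not Surround 360's actual pipeline.

```python
import numpy as np

def feather_stitch(left, right, overlap):
    """Stitch two horizontally adjacent grayscale strips.

    `left` and `right` are (H, W) arrays whose last/first `overlap`
    columns cover the same scene region. The overlap is alpha-blended
    with a linear ramp so there is no visible hard seam.
    """
    alpha = np.linspace(1.0, 0.0, overlap)  # weight for the left strip
    blended = left[:, -overlap:] * alpha + right[:, :overlap] * (1.0 - alpha)
    return np.hstack([left[:, :-overlap], blended, right[:, overlap:]])

# Two constant-brightness strips with a 4-pixel overlap: the result
# ramps smoothly from 100 down to 50 across the overlap region.
a = np.full((2, 6), 100.0)
b = np.full((2, 6), 50.0)
pano = feather_stitch(a, b, overlap=4)
print(pano.shape)  # (2, 8)
```

Real stitching also has to estimate the geometric alignment between cameras before blending; this sketch assumes the strips are already aligned.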
Another important feature of Facebook’s 3D-360° video capture system is that both the camera and the software will be open-sourced to outside developers. Competing video capture systems tend to be proprietary, have very limited availability, or be generally unreliable. According to Cabral, open-sourcing the system will:
“Accelerate the growth of the 3D-360 ecosystem… We want others to join us in refining this technology. We know from experience that a broader community can move things forward faster than we can on our own. All the software and hardware blueprints will be made available by the end of this summer. Make it faster, cheaper, smaller. Make it better. We’ll be working to do the same here. We can’t wait to see what you develop and, ultimately, what you create with it.”
The Facebook Surround 360 can produce video at up to 8K resolution and works with Gear VR. It is production-ready, rugged, and reliable, and the cameras can operate for long periods without overheating.
View Facebook’s promotional video on the technology below:
Source: Facebook. Read the full announcement from Brian Cabral, Director of Engineering at Facebook.
Ever wonder what happens each minute on the Internet? How about 2.4 million Google search queries, almost 350,000 tweets on Twitter, and over 700,000 Facebook logins. This is according to Excelacom, Inc.
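To put those per-minute figures in perspective, scaling them up to a full day takes only a couple of multiplications. The sketch below uses the numbers cited above; the rounded inputs are from the Excelacom graphic, and the dictionary names are just for illustration.

```python
# Per-minute figures cited from the Excelacom graphic (rounded).
per_minute = {
    "google_searches": 2_400_000,
    "tweets": 350_000,          # "almost" 350,000
    "facebook_logins": 700_000, # "over" 700,000
}

# 60 minutes/hour * 24 hours/day = 1,440 minutes per day.
per_day = {name: count * 60 * 24 for name, count in per_minute.items()}
print(per_day["google_searches"])  # 3456000000
```

That works out to roughly 3.5 billion Google searches in a single day.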
Check out the fascinating graphic to see the most-used apps and sites on the Internet!
Mr. A: “The data seems to miss all Chinese apps and sites. Where’s WeChat for example? It’s much bigger than most of these.”
Mr. B: “It’s because there’s the world Internet and the China Internet.”
Source: Excelacom, Inc.
John Rogers and colleagues at the University of Illinois at Urbana-Champaign have invented a new wearable patch that can measure blood flow. It works by subtly heating the skin and monitoring the heat as it moves through the bloodstream. Compared to similar technologies, the new wearable patch is simpler in design and would be cheaper to produce.
According to an article in MIT Technology Review:
The inventors of the new “epidermal electronic” sensor system say it is ready for use in a clinical setting, specifically for monitoring skin health, for example in patients who have recently had skin grafts. They say down the road it may also be possible to use it inside the body. In a recent demonstration, the researchers showed that the device can record accurate data from human subjects about the flow of blood in larger vessels, specifically veins in the forearm, as well as in the network of tiny vessels near the surface of the skin.
L’Oreal helped fund the research and is producing the wearable device as well as analytical software for it.
Read more about it on MIT Technology Review.