Search the Community

Showing results for tags 'science'.

  1. Just a place for neat stuff that will get lost too fast in a thread.

     Two designers just won $1.5 million for creating a device that can pull clean drinking water out of thin air
     [Image: a rendering of Skysource/Skywater Alliance's shipping container design for making drinking water out of air. The Skysource/Skywater Alliance]
     How the system works...
     https://www.businessinsider.com/device-that-harvests-water-from-air-wins-xprize-2018-10?r=UK&IR=T

     Inventor that harvested fresh water from the sky wins XPRIZE competition

     Grand prize winner in $1.75M Water Abundance XPRIZE announced at XPRIZE Visioneering 2018
     https://www.xprize.org/articles/waxp-grand-prize-winner
     Official release. Well done by all.
  2. Google Search and Lens add more features to solve math and science problems
     by Anushe Fawaz

     Google is introducing new features that help solve math and science problems directly from Search. In a blog post, Google highlighted its latest tools for solving math equations and word problems and for viewing interactive 3D diagrams on STEM-related topics. The company said:

     “Whether you're delving into a math textbook or turning to Search to get more context on a complicated physics problem, it can sometimes be hard to describe exactly what you're looking for. Take that intricate biology concept or pesky geometry problem, for example. With new features across Search and Lens, you can now visualize STEM-related concepts and figure out which equation to use by browsing for them in a more natural and intuitive way.”

     The latest update to Google Search lets users type in equations or integrals to get a solution. They can also scan an equation by taking a picture with Lens, which will provide a step-by-step answer. For geometry questions, users can use Lens to share the problem at hand without needing to type it out; otherwise, they can type “math solver” into the Search bar. The feature is available on desktop, with the mobile version coming soon.

     Google has also added a word problem solver. By typing in a word problem, users can now get a worked solution, owing to advancements in Google's language models. Word problems will initially cover fundamental topics from high-school physics.

     Lastly, Google has added interactive 3D diagrams to Search, alongside definitions and overviews, for almost 1,000 topics across biology, chemistry, physics, astronomy, and related fields. The company claims these 3D models help visualize complex concepts; for example, a user could examine a 3D model of a mitochondrion to see its inner membranes, or folds.

     Recently, Google also announced more AI-powered features for its Maps app to bring more accurate search results. It is also using AI to update information on speed limits and EV charging stations, among other capabilities.
  3. NASA calls on the public to send their names on its Europa Clipper mission
     by Paul Hill

     Every now and again, NASA gets ready to launch a major mission into space and offers the public a chance to have their names sent along with it. The agency has now opened up another such opportunity: you can send your name on NASA's upcoming Europa Clipper mission, which is set to reach Jupiter's second Galilean moon, Europa, in 2030.

     People’s names will accompany a poem called “In Praise of Mystery: A Poem for Europa” by US Poet Laureate Ada Limón. They will be stencilled onto a microchip and head off on the journey in October 2024. If you’re interested in including your name, you must add it to NASA’s website by December 31.

     Since the programme was announced on the evening of June 1, 2,767 people have added their names. The majority of these submissions are from the United States, but names from all the continents have started coming in too. Within the United States, most signatures have come from California, Texas, Florida, and New York.

     "'Message in a Bottle' is the perfect convergence of science, art, and technology, and we are excited to share with the world the opportunity to be a part of Europa Clipper's journey," said Nicola Fox, associate administrator for NASA's Science Mission Directorate in Washington. "I just love the thought that our names will be travelling across our solar system aboard the radiation-tolerant spacecraft that seeks to unlock the secrets of Jupiter's frozen moon."

     As mentioned, NASA has held similar programmes for its Artemis I mission to the Moon and for several missions to Mars. While anyone is free to add their name, NASA will be hoping the campaign piques the interest of children, who may become more interested in pursuing a career in the sciences as a result.

     In addition to letting you add your name, the website provides a world map that displays the locations signatures are being added from. You can also find a live feed of the Clipper clean room to see work going on.
  4. Dodo Park: A biosciences company is bringing back the dodo bird on the island of Mauritius
     by Dean Howell

     The flightless dodo bird, native to the island of Mauritius and a symbol of man-made extinction, may not be the dummy you think it is. The bird, once abundant on the island, became extinct in the late 17th century due to overhunting and habitat destruction by Dutch settlers – not because it was stupid. Today, the dodo remains a glaring example of the price of carelessness and the need for conservation efforts. Colossal, the biosciences company behind an effort to bring the dodo back, is shining a light on the importance of restoring species that have been lost to extinction.

     With this in mind, Colossal has partnered with the government of Mauritius to establish a foundation for the de-extinction and rewilding of the beloved bird. Basically, Colossal has partnered with the local government to build a safe, kid-friendly version of Jurassic Park.

     However, there are challenges. Bringing the dodo back from extinction requires a reliable reference genome. To address this, Colossal intends to create high-quality avian genomes that can be used for multi-genome alignments, comparative analysis, and phenotype predictions that will guide the genome editing process. Remember how they used frog DNA in Jurassic Park to complete the genome sequence for the dinosaurs? Yes, life truly does imitate art. Colossal's approach will involve ‘broad screening’ to optimize the growth parameters of the Nicobar pigeon's germ cells, which will serve as the host cells/genome for engineering. Frogs.

     In addition to the reference genome, Colossal will work to advance avian reproduction by demonstrating interspecies germline transfer of pigeon germ cells into surrogate chickens. This will enable the company to produce the dodo much faster. In short, they will have chickens laying dodo eggs… They will also use machine learning to create more efficient editing tools that can shorten the timeline for success. Lastly, Colossal will work to improve multiplex genome editing for higher-throughput engineering and the pursuit of more distantly related species de-extinctions. We think this means they will try to recreate the island as it was before the Dutch settlers arrived and wiped the bird out in the first place. Scientists, leave your take in the comments.

     Colossal's goal is to use the dodo as a beacon of hope for other conservation efforts. The company believes that the revival of the dodo will be an inspiration and a testament to the power of science and technology to bring species back from the brink of extinction. And presumably, to help sell tickets to “Dodo Park”.

     The revival of the dodo bird is a complex and challenging task, but Colossal is pushing the ball forward. It has partnered with Mauritius to establish a foundation for the de-extinction and rewilding of that dumb ol’ bird we all love so much. With a reliable reference genome, advances in avian reproduction, and machine learning, Colossal believes it can bring the dodo back from extinction and establish a trend for the development and rewilding of other species.

     Source: Colossal.com
  5. Scientists at Salesforce develop proteins with AI that can eat trash
     by Dean Howell

     What do you get when the world's largest CRM vendor breaks into the research industry and leverages AI to build its products? You get ProGen, a new AI system that can make artificial enzymes from scratch that work just as well as real ones found in nature. ProGen was made by Salesforce Research (yes, that Salesforce) and uses language processing to learn about biology. In short, ProGen takes amino acid sequences and turns them into proteins.

     In 1999, biologist Günter Blobel won the Nobel Prize for discovering the intrinsic signals that govern how proteins are transported and localized in the cell, but this new AI-powered tech may already be outpacing that era of protein science. ProGen speeds up the creation of new proteins, which can be used for many things, like medicines or breaking down plastic in landfills, presumably aiding us in avoiding the looming Great Garbage Avalanche of 2505.

     "The artificial designs are better than ones made by the normal process," said James Fraser, a scientist involved in the project. "We can now make specific types of enzymes, like ones that work well in hot temperatures or acid."

     To make ProGen, the scientists at Salesforce fed the system amino acid sequences from 280 million different proteins. The AI system quickly generated a staggering one million protein sequences, of which 100 were picked for testing. Out of these, five were made into actual proteins and tested in cells. That's just 0.0005% of the generated results! It seems like the next frontier is to develop an AI to test all the possibilities. Two of the artificial enzymes were just as good at breaking down bacteria as the natural enzymes found in egg whites, even though the two were only 18% alike.

     ProGen was made in 2020 using a large language model originally built for writing text, similar to ChatGPT. The AI system learned the rules and structure of proteins by looking at a lot of data. With proteins, there is a tremendous number of possibilities, but ProGen can still make working enzymes, even when there is wide variation among the results.

     "This is a new tool for protein engineers and we're excited to see what it can be used for," said Ali Madani, a scientist involved in the project. This project seems incredibly valuable, and must have cost Salesforce a fortune to get going, so we're surprised to see that the code for ProGen is available on GitHub for anyone who wants to try it (or add to it).

     Source: Salesforce via New Scientist
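To make the "language processing on amino acids" idea concrete, here is a minimal sketch of how a protein language model generates candidate sequences: treat the 20 amino acids as a vocabulary and sample one residue at a time, conditioned on what came before. The `toy_logits` function is a hypothetical stand-in for the real model; ProGen itself is a large transformer with learned weights, and nothing below reflects its actual API.

```python
# Sketch: autoregressive sampling over the 20-letter amino acid alphabet,
# the same generation loop a protein language model like ProGen uses.
# `toy_logits` is a hypothetical stand-in for a trained transformer.
import numpy as np

AMINO_ACIDS = list("ACDEFGHIKLMNPQRSTVWY")  # standard 20-letter protein alphabet

def toy_logits(prefix: str) -> np.ndarray:
    # Stand-in "model": mildly favors residues already seen in the prefix.
    scores = np.ones(len(AMINO_ACIDS))
    for ch in prefix:
        scores[AMINO_ACIDS.index(ch)] += 0.5
    return scores

def sample_protein(length: int, temperature: float = 1.0, seed: int = 0) -> str:
    rng = np.random.default_rng(seed)
    seq = ""
    for _ in range(length):
        logits = toy_logits(seq) / temperature
        probs = np.exp(logits - logits.max())  # softmax over the alphabet
        probs /= probs.sum()
        seq += rng.choice(AMINO_ACIDS, p=probs)
    return seq

print(sample_protein(30))  # one candidate sequence out of the millions a real model emits
```

Sampling millions of sequences this way and keeping only a handful for wet-lab testing is exactly the funnel the article describes: 1,000,000 generated, 100 screened, 5 expressed in cells.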
  6. New research shows that near-term quantum computers can learn to reason
     by Ather Fawaz

     The applications and development of quantum computers have steadily picked up pace in the last few years. We've seen researchers applying this novel method of computation in a variety of domains including quantum chemistry, fluid dynamics research, open problems, and even machine learning, all with promising results. Continuing this trend, UK-based startup Cambridge Quantum Computing (CQC) has now demonstrated that quantum computers "can learn to reason".

     This claim, confusing at first, is based on new research coming out of CQC. Dr. Mattia Fiorentini, Head of Quantum Machine Learning at the firm, and his team of researchers investigated using quantum computers for variational inference. Variational inference is a process through which we approximate a given probability distribution using stochastic optimization and other learning techniques. Jargon aside, this means a quantum computer outputs potential solutions to inferential questions such as: given that the grass is wet and it's cloudy, which is the more probable cause, rain or the sprinkler? Formally, the question is posed as follows: “What is the state of the unobserved variables given observed data?” These outputs can then be used in downstream tasks such as finding the likeliest explanation for the available data or predicting future outcomes and their confidence.

     The team's work, titled Variational inference with a quantum computer, has been published on the pre-print repository arXiv and highlights what the firm believes to be a promising indicator that quantum computers are great for variational inference and, by extension, at reasoning.

     Outputs from a quantum computer appear random. However, quantum computers can be programmed to output random sequences with certain patterns. These patterns are discrete and can become so complex that classical computers cannot compute them in reasonable time. This is why quantum computers are natural tools for probabilistic machine learning tasks such as reasoning under uncertainty.

     In the paper, the researchers demonstrate their results on Bayesian networks. Three different problem sets were tested. First was the classic cloud-sprinkler-rain problem described above. Second was the prediction of market regime switches (bull or bear) in a hidden Markov model of simulated financial time series. Third was the task of inferring likely diseases in patients given some information about symptoms and risk factors.

     Using adversarial training and the kernelized Stein discrepancy, the details of both of which can be found in the paper, the firm optimized a classical probabilistic classifier and a probabilistic quantum model, called a Born machine, in tandem.

     [Figures: the adversarial method; the kernelized Stein discrepancy method]

     Once trained, inference was carried out on the three problems defined earlier, both on a quantum simulator and on IBM Q's real quantum computers. In the truncated histograms from the paper, the magenta bars represent the true probability distribution, blue bars indicate outputs from a quantum computing simulator, and grey bars indicate the output from real quantum hardware from IBM Q. The results on real quantum hardware are marred by noise, which causes slower convergence compared to the simulation; that is to be expected in the NISQ era, however.

     [Figures: truncated histogram of the posterior distribution for a hidden Markov model; histogram of the posterior distribution for a medical diagnosis task]

     The probability distribution from the quantum simulator closely resembles the true probability distribution, indicating that the quantum algorithm has trained well and that the firm's adversarial training and kernelized Stein discrepancy methods are powerful algorithms for the intended purpose. From the paper:

     "We demonstrate the approach numerically using examples of Bayesian networks, and implement an experiment on an IBM quantum computer. Our techniques enable efficient variational inference with distributions beyond those that are efficiently representable on a classical computer."

     The firm believes this is yet another indicator that "sampling from complex distributions is the most promising way towards a quantum advantage with today’s noisy quantum devices," and that its new inference methods "can incorporate domain expertise". Moving forward, the firm envisions "a combination with other machine learning tasks in generative modeling or natural language processing for a meaningful impact."

     Further details can be found in this blog post and the paper on arXiv. If you are interested, you can check out Dr. Mattia's interview on YouTube here.
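For readers unfamiliar with the cloud-sprinkler-rain example, here is the exact posterior worked out by brute-force enumeration. The conditional probability values below are illustrative textbook numbers for this classic network, not figures from the CQC paper; variational inference approximates this kind of posterior when the network is far too large to enumerate.

```python
# The classic cloud-sprinkler-rain Bayesian network, solved exactly by
# enumeration. CPT values are illustrative textbook numbers, not from the paper.
from itertools import product

P_C = {True: 0.5, False: 0.5}                     # P(cloudy)
P_S = {True: {True: 0.1, False: 0.9},             # P(sprinkler | cloudy)
       False: {True: 0.5, False: 0.5}}
P_R = {True: {True: 0.8, False: 0.2},             # P(rain | cloudy)
       False: {True: 0.2, False: 0.8}}
P_W = {(True, True): 0.99, (True, False): 0.9,    # P(grass wet | sprinkler, rain)
       (False, True): 0.9, (False, False): 0.0}

def joint(c, s, r, w):
    pw = P_W[(s, r)] if w else 1.0 - P_W[(s, r)]
    return P_C[c] * P_S[c][s] * P_R[c][r] * pw

# Condition on the evidence (cloudy=True, wet=True) and marginalize the rest.
evidence = sum(joint(True, s, r, True) for s, r in product([True, False], repeat=2))
p_rain = sum(joint(True, s, True, True) for s in [True, False]) / evidence
p_sprinkler = sum(joint(True, True, r, True) for r in [True, False]) / evidence
print(f"P(rain | wet, cloudy)      = {p_rain:.3f}")       # ~0.976: rain is the likelier cause
print(f"P(sprinkler | wet, cloudy) = {p_sprinkler:.3f}")  # ~0.130
```

With four binary variables this takes microseconds classically; the paper's point is that a Born machine can represent and sample from posteriors whose structure classical samplers cannot reproduce efficiently.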
  7. TWIRL 6: Blue Origin prepares for a human space flight
     by Paul Hill

     Welcome to This Week in Rocket Launches 6. It looks set to be a bit of an interesting week, with Blue Origin planning to launch its New Shepard NS-15 mission, which will prep the firm for a crewed flight next time around. We’ve also got launches from SpaceX, Rocket Lab, Roscosmos, China, and ExPace, which will re-attempt a mission originally slated for last week.

     Monday, March 29

     The first rocket we could see take off this week is ExPace’s Kuaizhou KZ-1A carrying the Jilin Gaofen 2D satellite, which will join the Jilin 1 Earth observation constellation. The payload, also known as Jilin 28, is a 230 kg satellite that will be used to take photos of Earth from 535 km. The launch of this rocket is not set in stone, but it could go up from Monday.

     SpaceX is also looking to carry out its Starship SN11 mission on Monday. The firm was aiming for a launch last Friday, but it was ultimately scrubbed. Similarly to the last two test flights, SN11 will fly to 10 km before attempting to land. During the SN10 mission, Starship did land, but it also managed to catch fire, which caused the ship's destruction several minutes later. The SN11 mission will be streamed by SpaceX on its YouTube channel when Starship is ready to launch.

     "For those just tuning in, it was scrubbed. They will try again on Monday! 🤓🚀" — Neowin (@NeowinFeed) March 26, 2021

     Thursday, April 1

     The first rocket that we could see launch on Thursday is Rocket Lab’s Electron. As part of its STP-27RM mission, Rocket Lab will launch an experimental payload for the U.S. Air Force. The payload is a space weather instrument called Monolith and is part of the Space Test Program. It will demonstrate the ability of small satellites to support large-aperture payloads. While the mission is eligible for launch on Thursday, it could launch later.

     Another mission that could take off from Thursday is New Shepard NS-15. Blue Origin, which makes New Shepard, is the firm owned by Amazon’s Jeff Bezos. The mission will use a booster called Tail 4, which comes equipped with an improved BE-3PM engine. The CC 2.0-2 RSS First Step capsule has been upgraded for astronauts, who will come aboard and then leave again before the rocket launches – this will prepare them for NS-16, which will be a crewed mission. You’ll be able to find a stream of the event here. As with the other launches so far, this mission is also marked with a No Earlier Than (NET) tag, so it may happen after Thursday. [Video: Blue Origin's NS-14 mission]

     The final launch on Thursday will come from Roscosmos, which is launching a Soyuz 2.1b with the fourth Resurs-P satellite. The Resurs-P series are Earth observation satellites used by several Russian governmental agencies, including Russia’s meteorological agency. The launch was delayed from November 2019 and November 2020.

     Friday, April 2

     The final launch of the week will come from China. The Long March CZ-4C will carry the Gaofen 12-02 remote sensing satellite, which will perform high-resolution Earth observation. The satellite has sub-meter resolution, which is suited to urban planning, crop yield estimation, and disaster prevention. The mission is being launched for the China High-definition Earth Observation System (CHEOS).
  8. Lack of thrust and an ad hoc solution to SN8's explosion led to Starship SN10's fiery ending
     by Ather Fawaz

     Less than a week ago, SpaceX almost succeeded in completing a flawless test flight for the Starship SN10. Things were looking up until a few moments after a successful ascent and touchdown back on Earth: just like its SN8 and SN9 predecessors, SN10 too met a fiery, explosive ending. It was unclear why the rocket exploded, especially after it had touched down and remained stationary for close to a minute. SpaceX founder and CEO Elon Musk has now cleared up some confusion on that front.

     Musk tweeted that the "SN10 engine was low on thrust due (probably) to partial helium ingestion from fuel header tank," and that the prototype came down at an impact velocity of 10 m/s, which "crushed legs & part of skirt". Multiple fixes are in the works for SN11.

     As spotted by Engadget, Chris Bergin of NASA Spaceflight pointed out that the helium ingestion was caused by the pressurization system that was added to the CH4 tank to prevent what caused the SN8 to explode. Musk replied to Bergin that this was a fair point: "If autogenous pressurization had been used, CH4 bubbles would most likely have reverted to liquid. Helium in header was used to prevent ullage collapse from slosh, which happened in prior flight. My fault for approving. Sounded good at the time."

     Despite this, the SN10's touchdown represents a big step forward for the Starship program and SpaceX at large. The firm was quick to bring the next prototype, the SN11, to the stand at Boca Chica in preparation for its test flight sometime later. As Austin Barnard photographed, the team of engineers on-site inspected every landing leg of the SN11. SpaceX hopes that the insight gained from its predecessors, including the SN10, will help it do away with past mistakes and progress towards a successful test flight for the SN11.
  9. Apple shares Hearing Study insights for World Hearing Day
     by Paul Hill

     In time for World Hearing Day, which takes place globally on March 3, Apple has shared some findings from a study it is carrying out on the long-term auditory health of its users in the United States. Apple said the participation levels in this study are on a scale never seen before thanks to the ease of taking part; users can join the study using an Apple Watch and iPhone and by sharing their health data.

     One of the interesting findings revealed that 25% of participants experience environmental sound levels higher than the World Health Organization’s (WHO) recommended limit. This noise can come from anywhere, including traffic, machinery, and public transport. The WHO has estimated that 700 million people around the world will be affected by profound hearing loss by 2050; with its Noise app on Apple Watch, the tech firm hopes to save some of its users from hearing loss.

     Another source of hearing loss is the way we use headphones to listen to music. One in ten of Apple’s participants listens to sound through headphones at a level above WHO recommendations. Apple says that to protect hearing, users ought to listen to media at the lowest enjoyable volume rather than at full blast.

     The study also suggests that a lot of people are neglecting their ears. Around 10% of participants have been diagnosed with hearing loss by a professional, and of these, three-quarters do not use tools such as hearing aids or cochlear implants even though they can reduce the impact of hearing loss. Furthermore, Apple found that a full 50% of participants hadn’t had their hearing tested by a professional in the last 10 years, and a quarter of all participants reported a ringing in their ears, which is a sign of hearing loss.

     To take part in the study yourself, head over to the App Store and download the Apple Research app. If you’re not sure how good your hearing is, or lack access to testing facilities, the World Health Organization provides an app for Android and iOS called HearWHO in which you listen for numbers being read out over static; this can give you an estimate of how good your hearing is.
  10. Elon Musk claims that his wired-up Neuralink monkey is happy and enjoys playing video games
     by Ather Fawaz

     [Image via The Telegraph]

     Neuralink is Elon Musk's bold initiative to create an interface between a brain and a computer chip. Revealed back in 2017, the company was funded entirely out of Musk's own pocket. While many have raised concerns about the potential misuse of the technology, Musk insists that its raison d'être is to cure medical injuries related to the spinal cord and brain and to improve cognition and memory.

     Over the years, there have been a number of updates on the project. Perhaps most importantly, the chip was successfully implanted in a monkey, allowing it to control a computer with its brain. While the exact nature of this control wasn't entirely clear, Musk called the result "very positive."

     Building on this, Musk has given a few added details about Neuralink's tests on monkeys. Bloomberg reports that while speaking on Clubhouse, a private social app where users engage in informal conversations, Musk told several thousand listeners that his company has a happy monkey with a Neuralink implant in its skull that enjoys playing video games:

     “We have a monkey with a wireless implant in their skull with tiny wires who can play video games with his mind. You can’t see where the implant is and he’s a happy monkey. We have the nicest monkey facilities in the world. We want them to play mind-Pong with each other.”

     When Neuralink's plans were first detailed, many people raised the concern that implanting a device in the skull would not only be a notoriously difficult procedure, but might not look all that cosmetically pleasing either. Speaking to Joe Rogan on his podcast, the billionaire clarified that the device would sit flush with the skull. He further added on Clubhouse: “There are primitive versions of this device with wires sticking out of your head, but it’s like a Fitbit in your skull with tiny wires that go into your brain.”

     Going back to his plugged-in monkeys, Musk indicated that videos of them could be released in about a month. As with the experiments before, the exact nature and extent of the monkeys' control over the computer are unclear. The videos could clear up the confusion, and for those, we shall be on the lookout.

     Source: Bloomberg
  11. Gooseberry: A dive into Microsoft's new quantum control chip to handle thousands of qubits
     by Ather Fawaz

     [Image via Microsoft Research]

     Quantum computers provide a promising new model of computation that enables exponential speedups over certain classical algorithms. But their Achilles' heel is a qubit's penchant for decoherence. That is, contemporary qubits are sensitive to changes in their environment and tend to lose their superposition because of it. Quantum superposition, as it turns out, is the central tenet of quantum computation and is vital for achieving the said exponential speedups.

     Researchers have been working towards making qubits more robust to changes in their environment without losing their controllability. A common solution is keeping qubits in cryogenic environments at temperatures tantalizingly close to absolute zero (0 K), but this mechanical setup becomes a significant limitation in scaling up quantum computers for commercial use cases. As a result, this remains an open research problem.

     To this end, Microsoft, in collaboration with a team from the University of Sydney, has developed a cryogenic quantum control platform that uses specialized CMOS circuits to address the problem of qubit control and decoherence. In the paper "A Cryogenic Interface for Controlling Many Qubits", the researchers present Gooseberry, a CMOS chip that takes digital inputs and generates many parallel qubit control signals, thereby allowing scaled-up support for thousands of qubits—a feat Microsoft deems a "leap ahead from previous technology". Gooseberry enables this by operating at 100 mK while dissipating so little power that it does not heat up the qubits themselves, which means the entire setup does not exceed the cooling capacity of commercially available quantum computing refrigerators. The team also used Gooseberry to create what it is calling a novel general-purpose cryo-compute core.

     The proposed setup uses a special breed of qubits called topological qubits. These qubits are more resilient to decoherence and have hardware-level error protection baked into them, reducing the overhead needed for software-level error correction and enabling meaningful computations to be done with fewer physical qubits.

     Taking a deeper look at the setup, the quantum-classical interface layers are where the meat of the communication happens. Gooseberry sits alongside the qubits in the lower stage due to its cryogenic requirements. It is thermally isolated from the qubits, and its dissipated heat is drawn into a mixing chamber. Once ensconced near the qubits, Gooseberry converts classical instructions from the cryo-compute core into voltage signals, which are then sent to the qubits.

     [Figure: (Left) A simplified version of the thermal conductance model of the Gooseberry chip. (Right) The Gooseberry chip (red) sits close to the qubit test chip (blue) and resonator chip (purple).]

     Together, the chips manage communication between various parts of a quantum computer. Essentially, they are used to send and receive information to and from every qubit, but in a way that maintains a stable cold environment, which is a significant challenge for a large-scale commercial system with tens of thousands of qubits or more. The stack itself operates at 2 K, a temperature 20 times warmer than the one at which Gooseberry operates. This frees up 400 times as much cooling power, allowing the stack to dissipate 400 times as much heat. Due to this, Microsoft believes the stack is capable of general-purpose computing.

     Putting Gooseberry to the test, the researchers connected it to a GaAs-based quantum dot (QD) device. Temperatures of the chip's components were measured as the control chip was powered up. As expected, the temperatures remained below 100 mK within the necessary range of frequencies and clock speeds. These results were extrapolated to show the total system power needed for Gooseberry as a function of frequency and the number of output gates.

     "[Gooseberry] is able to operate within the acceptable limits while communicating with thousands of qubits. This CMOS-based control approach also appears feasible for qubit platforms based on electron spins or gatemons."

     Though at present the proposed core can only handle some data and triggering manipulation, its temperature freedom opens vital room for more technologies and ideas to work with:

     "This is a general-purpose CPU operating at cryogenic temperatures. At present, the core operates at approximately 2 K, and it handles some triggering manipulation and handling of data. With fewer limitations from temperature, it also deals with branching decision logic, which requires more digital circuit blocks and transistors than Gooseberry has."

     The team at Microsoft and the researchers from the University of Sydney believe that Gooseberry and the bundled cryo-compute core are big steps forward in quantum computing. The cryo-compute core, acting as an interface between source code written by developers, Gooseberry, and the qubits, shows that it’s possible to compile and run multiple types of code in a cryogenic environment, allowing for software-configurable communication between qubits and the outside world.
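The 20x/400x relationship quoted above follows from the roughly quadratic scaling of dilution-refrigerator cooling power with temperature, a standard rule of thumb we assume here rather than a figure stated in the paper. A quick back-of-the-envelope check:

```python
# Back-of-the-envelope check on the article's cooling-power numbers.
# Dilution-refrigerator cooling power scales roughly with T^2 (a common
# rule of thumb, assumed here), so a 20x warmer stage buys ~20^2 = 400x
# more heat-dissipation headroom.
T_gooseberry = 0.100  # kelvin: the stage where the Gooseberry control chip sits
T_stack = 2.0         # kelvin: the stage where the cryo-compute core sits

ratio = T_stack / T_gooseberry
print(f"Temperature ratio:   {ratio:.0f}x warmer")          # 20x
print(f"Cooling-power ratio: ~{ratio**2:.0f}x more headroom")  # ~400x
```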
  12. NASA and Boeing are targeting March 25 for Starliner's second unmanned orbital flight test
     by Ather Fawaz

     [Image via NASA Blogs]

     Boeing and NASA have set March 25, 2021 as the date for Starliner's second unmanned flight test. Dubbed Orbital Flight Test-2 (OFT-2), this will be the second major flight test for the spacecraft and a key developmental milestone in Boeing's bid for the NASA Commercial Crew program. The two were previously targeting March 29, but the date was moved up due to multiple factors, including the availability of the United Launch Alliance Atlas V rocket, an opening on the Eastern Range, steady progress on hardware and software, and a docking opportunity on the International Space Station.

     This announcement comes after Boeing completed the formal requalification of Starliner's flight software for the upcoming mission, which included a full software review to verify that the software meets design specifications. A complete, end-to-end simulation of the OFT-2 test flight using flight hardware and software will also be conducted before the test day.

     Recently, Boeing also mated Starliner's reusable crew module to its new service module. Engineers are now working to complete the outfitting of the vehicle’s interior before loading cargo and conducting final spacecraft checkouts. A series of parachute balloon drop tests was completed in December last year as well, to gather supplemental performance data on the spacecraft’s parachutes and landing systems before a crewed test is conducted sometime in the future.

     Starliner's last orbital flight test took place back in December 2019. On that voyage, the spacecraft experienced a mission timing anomaly that caused it to burn too much fuel to reach the International Space Station (ISS). Consequently, it was put into a lower, stable orbit, where Starliner demonstrated key systems and capabilities before returning to Earth. When it touched down on December 22, it became the first American orbital space capsule to land on American soil rather than in an ocean.
  13. "Mars, here we come!!" exclaims Elon Musk despite explosive ending to Starship's test flight by Ather Fawaz Image via Trevor Mahlmann (YouTube) The Starship initiative by SpaceX is meant to make spaceflights to Mars a reality. After a scrubbed launch yesterday courtesy of an auto-abort procedure in the Starship's Raptor engines, once again, SpaceX geared up for a re-run of the test a few hours back. This time, Starship SN8 successfully took flight from its test site in Boca Chica, Texas. A trimmed version of the complete event is embedded below from Trevor Mahlmann's YouTube channel. Compared to the scrubbed launch, things went better on this one, but not entirely. The gargantuan 160-feet tall rocket, propelled by three Raptor engines, took flight, and intended to rise to a height of 41,000 ft (12,500 m). SpaceX founder Elon Musk called the ascent a success, but it's not clear whether the rocket reached its intended altitude. Nevertheless, after reaching its highest point, the rocket began its journey back to its earthly test site. Image via Trevor Mahlmann (YouTube) The SN8 prototype performed a spectacular mid-air flipping maneuver to set itself on course to land vertically back to the earth—a feat we've all grown accustomed to seeing with SpaceX's Falcon 9 rocket. The SN8 executed the landing flip successfully, and SpaceX tweeted a closer look at the event as it happened. Impressively, SpaceX claimed that by doing so, the SN8 became the largest spacecraft to perform a landing maneuver of this sort. Starship landing flip maneuver pic.twitter.com/QuD9HwZ9CX — SpaceX (@SpaceX) December 10, 2020 But as the rocket prepared to touch down and its boosters tried to slow down its descent to cushion the landing, the rocket's fuel header tank pressure got low. This caused the "touchdown velocity to be high & RUD," during the landing burn, Musk tweeted. Unfortunately, this meant that upon touchdown, the Starship SN8 prototype exploded into flames. Image via SpaceX Livestream Notwithstanding the fiery, unfortunate event right at the final few moments, SpaceX and Musk hailed the test as a success. For the company, "SN8 did great! Even reaching apogee would’ve been great, so controlling all way to putting the crater in the right spot was epic!!" Musk tweeted, "We got all the data we needed. Congrats SpaceX team hell yeah!!", he continued; before following up with another tweet exclaiming "Mars, here we come!!"
  14. Intel shows promising progress and key advances in integrated photonics for data centers
     by Ather Fawaz

     [Image via Intel Press Kit]

     The effective management, control, and scaling of electrical input/output (I/O) are crucial in data centers today. Innovative responses to this challenge include Microsoft's Project Natick, which submerged a complete data center underwater, and optical computing and photonics, which aim to use light both within a device and for transferring information.

     Building on the latter, at its Intel Labs Day 2020 conference today, Intel highlighted key advances in the fundamental technology building blocks that are a linchpin of the firm's integrated photonics research. These building blocks include light generation, amplification, detection, and modulation, along with the accompanying complementary metal-oxide-semiconductor (CMOS) circuitry, all of which are essential to achieving integrated photonics.

     Among the first noteworthy updates, Intel showed off a prototype featuring tight coupling of photonics and CMOS technologies, which served as a proof of concept for future full integration of optical photonics with core compute silicon. Intel also highlighted micro-ring modulators that are 1,000x smaller than contemporary components found in electronic devices today. This is particularly significant as the size and cost of conventional silicon modulators have been a substantial barrier to bringing optical technology onto server packages, which would require the integration of hundreds of these devices.

     The key developments can be summarized as follows (a toy bandwidth calculation appears at the end of this item):

     • Micro-ring modulators: Conventional silicon modulators take up too much area and are costly to place on IC packages. By developing micro-ring modulators, Intel has miniaturized the modulator by a factor of more than 1,000, thereby eliminating a key barrier to integrating silicon photonics onto a compute package.
     • All-silicon photodetector: For decades, the industry has believed silicon has virtually no light detection capability. Intel showcased research that proves otherwise. Lower cost is one of the main benefits of this breakthrough.
     • Integrated semiconductor optical amplifier: With a focus on reducing total power consumption, integrated semiconductor optical amplifiers become an indispensable technology, made possible with the same material used for the integrated laser.
     • Integrated multi-wavelength lasers: Using a technique called wavelength division multiplexing, separate wavelengths from the same laser can be used to convey more data in the same beam of light. This enables additional data to be transmitted over a single fiber, increasing bandwidth density.
     • Integration: By tightly integrating silicon photonics and CMOS silicon through advanced packaging techniques, Intel can gain three benefits: (1) lower power, (2) higher bandwidth, and (3) reduced pin count.

     Intel says it is the only company that has demonstrated integrated multi-wavelength lasers and semiconductor optical amplifiers, all-silicon photodetectors, and micro-ring modulators on a single technology platform tightly integrated with CMOS silicon. This research breakthrough paves the path for scaling integrated photonics. These results point towards the extended use of silicon photonics beyond the upper layers of the network and onto future server packages. The firm also believes it paves a path towards integrating photonics with low-cost, high-volume silicon, which could eventually power our data centers and networks with high-speed, low-latency links.
     [Image via Intel Press Kit]

     “We are approaching an I/O power wall and an I/O bandwidth gap that will dramatically hinder performance scaling," said James Jaussi, Senior Principal Engineer and Director of the PHY Lab at Intel Labs. He signaled that the firm's "research on tightly integrating photonics with CMOS silicon can systematically eliminate barriers across cost, power, and size constraints to bring the transformative power of optical interconnects to server packages.”
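To make the wavelength-division-multiplexing claim in the list above concrete, here is a toy calculation of why carrying N wavelengths in one fiber raises bandwidth density. The channel counts and the per-channel data rate are illustrative assumptions, not figures from Intel's announcement.

```python
# Toy illustration of wavelength division multiplexing (WDM): N independent
# wavelengths in one fiber multiply the aggregate data rate. The numbers
# below are illustrative assumptions, not Intel's figures.
per_channel_gbps = 100  # assumed per-wavelength data rate
for n_wavelengths in (1, 4, 8, 16):
    aggregate = n_wavelengths * per_channel_gbps
    print(f"{n_wavelengths:2d} wavelength(s) x {per_channel_gbps} Gb/s "
          f"= {aggregate:4d} Gb/s over a single fiber")
```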
  15. Intel Labs Day 2020: Robotics demonstrations and a next-gen neuromorphic chip on the horizon
     by Ather Fawaz

     [Image: Loihi, Intel’s neuromorphic research chip, via Intel Press Kit]

     Neuromorphic computing, as the name implies, aims to emulate the human brain's neural structure for computation. It's a relatively recent idea and one of the more radical takes on contemporary computer architectures. Work on it has been gaining traction, and promising results have emerged; as recently as June this year, a neuromorphic device was used to recreate a gray-scale image of Captain America’s shield.

     Alongside other notable announcements at Intel Labs Day 2020, the firm gave an update on the progress of its Intel Neuromorphic Research Community (INRC), whose aim is to expand the applications of neuromorphic computing in business use cases. The consortium, which originally came together in 2018 and includes some Fortune 500 and government members, has now expanded to over 100 companies and academic groups, with new additions like Lenovo, Logitech, Mercedes-Benz, and Prophesee. Intel also highlighted research results coming out of the INRC, computed on the company’s neuromorphic research test chip, Loihi, at the virtual conference.

     [Image: an Intel Nahuku board, each of which contains 8 to 32 Intel Loihi neuromorphic chips, via Intel Press Kit]

     Researchers showcased two state-of-the-art neuromorphic robotics demonstrations. In the first, by Intel and ETH Zurich, Loihi adaptively controlled a horizon-tracking drone platform, achieving closed-loop speeds of up to 20 kHz with 200 µs of visual processing latency, a 1,000x gain in combined efficiency and speed compared to traditional solutions. In the second, the Italian Institute of Technology and Intel showed multiple cognitive functions, like object recognition, spatial awareness, and real-time decision-making, running together on Loihi in IIT’s iCub robot platform.

     Other updates highlighted at the conference include:

     • Voice command recognition: Accenture tested the ability to recognize voice commands on Intel’s Loihi chip versus a standard graphics processing unit (GPU) and found Loihi not only achieved similar accuracy, it was up to 1,000 times more energy efficient and responded up to 200 milliseconds faster. Through the INRC, Mercedes-Benz is exploring how these results could apply to real-world use cases, such as adding new voice interaction commands to vehicles.
     • Gesture recognition: Traditional AI works well for crunching big data and recognizing patterns across thousands of examples, but it has a hard time learning subtle differences that change from person to person – like the gestures we use to communicate. Accenture and INRC partners are demonstrating tangible progress in using Loihi’s self-learning capabilities to quickly learn and recognize individualized gestures. Processing input from a neuromorphic camera, Loihi can learn new gestures in just a few exposures. This could be applied to a variety of use cases, such as interacting with smart products in the home or touchless displays in public spaces.
     • Image retrieval: Researchers from the retail industry evaluated Loihi for image-based product search applications. They found Loihi could generate image feature vectors over 3 times more energy-efficiently than conventional central processing unit (CPU) and GPU solutions while maintaining the same level of accuracy. This work complements similarity-search results from Intel’s Pohoiki Springs neuromorphic research system, published earlier this year, which showed Loihi’s ability to search feature vectors in million-image databases 24 times faster and with 30 times lower energy than a CPU.
     • Optimization and search: Intel and its partners have found that Loihi can solve optimization and search problems over 1,000 times more efficiently and 100 times faster than traditional CPUs. Optimization problems – such as constraint satisfaction – provide potential value at the edge, for instance enabling drones to plan and make complex navigation decisions in real time. The same problem type could also be scaled up for complex data center workloads, assisting with tasks like train scheduling and logistics optimization.
     • Robotics: Rutgers and TU Delft researchers published new demonstrations of robotic navigation and micro-drone control applications running on Loihi. TU Delft’s drone performed optic-flow landings with an evolved, 35-neuron spiking network running at frequencies over 250 kilohertz. Rutgers found its Loihi solutions required 75 times lower power than conventional mobile GPU implementations, without any loss in performance. In work published at the 2020 Conference on Robot Learning in November, Rutgers researchers also found Loihi could successfully learn numerous OpenAI Gym tasks with accuracy equivalent to a deep actor network, at 140 times lower energy consumption than a mobile GPU solution.

     Moving forward, Intel will integrate the lessons of the last couple of years of experiments into the second generation of its Loihi neuromorphic chip. While technical details of the next-gen chip are still nebulous, Intel says that it is on the horizon and "will be coming soon".
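For context on what a chip like Loihi actually computes: spiking neural networks are built from neurons that integrate input current and emit discrete spikes. Below is a minimal leaky integrate-and-fire (LIF) neuron, the basic primitive such chips implement in silicon. The constants and the simple Euler update are illustrative; this is not Intel's neuron model or API.

```python
# A minimal leaky integrate-and-fire (LIF) neuron, the building block of the
# spiking networks that run on neuromorphic chips. Illustrative constants;
# not Intel's actual neuron model.
import numpy as np

def lif_neuron(input_current, dt=1e-3, tau=20e-3, v_thresh=1.0, v_reset=0.0):
    """Simulate one LIF neuron; returns the voltage trace and spike times."""
    v, voltages, spikes = 0.0, [], []
    for step, i_in in enumerate(input_current):
        v += dt / tau * (-v + i_in)   # membrane leaks toward 0, driven by input
        if v >= v_thresh:             # threshold crossed: emit a spike
            spikes.append(step * dt)
            v = v_reset               # reset membrane after spiking
        voltages.append(v)
    return np.array(voltages), spikes

# Constant drive above threshold produces a regular spike train.
current = np.full(1000, 1.5)          # 1 second of input at dt = 1 ms
_, spike_times = lif_neuron(current)
print(f"{len(spike_times)} spikes in 1 s of simulated time")
```

Because neurons only communicate via sparse spikes rather than dense activations, chips built on this primitive can be dramatically more energy-efficient for workloads like the ones Intel lists above.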
  16. China launches Chang'e-5 mission to extract and bring lunar rock samples to Earth
     by Ather Fawaz

     [Image via National Geographic]

     China successfully launched its Chang'e-5 mission on Monday, sending a spacecraft to the Moon to collect rock samples. If everything goes according to plan, the lander portion of the spacecraft will touch down on the lunar surface by the end of this week and will have approximately 14 days—roughly the length of a lunar daytime—to collect the samples and send them back to Earth.

     The spacecraft took off from the Wenchang space site on Hainan Island in China on Monday. Unlike on previous missions, China was open about live-streaming and consistently sharing information about the launch procedures. The entire event was live-streamed by Chinese state media without any delay, showing the growing confidence that the nation has in its space program.

     The mission is being hailed as the most ambitious in China's space history. Not only is it the first attempt at collecting lunar rock samples in over forty years, but it also sets the nation on course to become only the third country to bring pieces of the Moon back to Earth, joining the ranks of the U.S. and the Soviet Union, which completed the feat with the Apollo missions and the robotic Luna landings, respectively.

     China plans to land Chang'e-5 on Mons Rümker, an isolated volcanic formation in the northwest part of the Moon's near side that is much younger than the sites the Apollo astronauts visited. Once there, the spacecraft is slated to retrieve more than four pounds of lunar samples. For comparison, the three successful Soviet Luna missions brought back close to 0.625 pounds in total, while NASA’s Apollo astronauts ferried 842 pounds of moon rock and soil back to Earth.

     From liftoff to touchdown back on Earth, the entire mission is scheduled to take less than a month. China hopes that the successful completion of Chang’e-5 will be a stepping stone towards establishing an international lunar research station and, eventually, a crewed presence on the Moon in the next decade.

     Source: The New York Times via Engadget
  17. James Webb Telescope completes environmental testing ahead of 2021 launch
     by Ather Fawaz

     NASA's James Webb Space Telescope is the largest, most powerful, and most complex space science telescope ever built, and it is slated for launch next year. In the march to launch of this $9.8 billion venture, the telescope is undergoing the final stages of testing after completing assembly back in August. Yesterday, it successfully completed its environmental testing, indicating that the telescope is ready to survive the harsh conditions it will be subjected to once it's launched into space.

     Webb’s recent tests have validated that the fully assembled observatory will endure the deafening noise and the jarring shakes, rattles, and vibrations it will experience during liftoff. Known as “acoustic” and “sine-vibration” testing, these trials saw NASA work carefully with its international partners to match Webb’s testing environment precisely to what Webb will experience both on launch day and when operating in orbit.

     The environmental testing comprised two stages at two separate facilities at Northrop Grumman’s Space Park in Redondo Beach, California. In the first stage, Webb was moved to the acoustic testing chamber, where it was subjected to sound pressure levels above 140 decibels to mimic a rocket's ascent to space. Close to 600 individual channels of motion data were observed and recorded, and the test was marked as a success. In the second stage, the telescope was transported to a separate facility to simulate the low-frequency vibrations that occur during liftoff. This too was a success.

     [Image: the James Webb Telescope, via NASA]

     While the telescope has been meticulously tested throughout its assembly and development, these latest tests were a milestone achievement because they are the last of their kind before the telescope is ferried to South America for launch. The complete verification of flight worthiness, however, will occur only after the telescope has successfully completed its final deployment tests. These should be around the corner, as the completion of the environmental tests puts Webb in the pipeline for a final systems evaluation before it receives a go for launch.

     Should things go according to plan, and weather permitting, the James Webb Telescope will take flight atop an Ariane 5 rocket from Europe's Spaceport in Kourou, French Guiana, on October 31 next year.
  18. QCE20: Here's what you can expect from Intel's new quantum computing research this week
     by Ather Fawaz

     The IEEE Quantum Week (QCE20) is a conference where academics, newcomers, and enthusiasts alike come together to discuss new developments and challenges in the field of quantum computing and engineering. Due to COVID-19 restrictions, this year's conference is being held virtually, starting today and running until October 16. Throughout the event, QCE20 will host parallel tracks of workshops, tutorials, keynotes, and networking sessions led by industry front-runners like Intel, Microsoft, IBM, and Zapata. From the pack, today we’ll peek into what Intel has in store for the IEEE Quantum Week. Specifically, we’ll preview Intel's array of new papers on developing commercial-grade quantum systems.

     [Image via Intel]

     Designing high-fidelity multi-qubit gates using deep reinforcement learning

     Starting off, Intel will present a paper in which researchers employed a deep learning framework to simulate and design high-fidelity multi-qubit gates for quantum dot qubit systems. This research is interesting because quantum dot silicon qubits can potentially improve the scalability of quantum computers due to their small size. The paper also indicates that machine learning is a powerful technique for optimizing the design and implementation of quantum gates. A similar insight was used by a team at the University of Melbourne back in March, in which researchers used machine learning to pinpoint the spatial locations of phosphorus atoms in a silicon lattice to design better quantum chips and subsequently reduce errors in computations.

     Efficient quantum circuits for accurate state preparation of smooth, differentiable functions

     Next up, Intel's second paper proposes an algorithm that optimizes the loading of certain classes of functions, e.g., Gaussian and other probability distributions, which are frequently used for mapping real-world problems onto quantum computers. By loading data into a quantum computer faster and increasing throughput, the researchers believe we can save time and better leverage the exponential compute power offered by quantum computers in practical applications.

     [Image via Intel]

     On connectivity-dependent resource requirements for digital quantum simulation of d-level particles

     One of the earliest and most useful applications of quantum computers is simulating a quantum system of particles. Consider the scenario where the ground state of a particle is to be calculated to study a certain chemical process. Traditionally, this task involves obtaining the lowest eigenvalue of the matrix that represents the system's states, known as the Hamiltonian. But this deceptively simple task grows exponentially harder for larger systems with innumerable particles. Naturally, researchers have devised quantum algorithms for it. Intel’s paper highlights the development and resource requirements of running such algorithms on small qubit systems. The firm believes the insight garnered from these findings can have implications for designing qubit chips in the future while making quantum computing more accessible.

     A BIKE accelerator for post-quantum cryptography

     While we’re still in the NISQ (Noisy Intermediate-Scale Quantum) era of quantum computers, meaning that perfect quantum computers with thousands of qubits running Shor’s algorithm are still a thing of the future, firms have already started preparing for a ‘quantum-safe’ future. One of the foreseeable threats posed by quantum computers is the ease with which they can factor large numbers, and hence threaten to break our existing standards of encryption. In this paper, researchers at Intel aim to address this concern: by presenting a hardware accelerator design for BIKE (Bit-flipping Key Encapsulation), today’s cryptosystems can be made resilient to quantum attacks. Notably, this approach is also currently under consideration by the National Institute of Standards and Technology (NIST), so a degree of adoption and standardization might be on the cards in the future.

     Engineering the cost function of a variational quantum algorithm for implementation on near-term devices

     Addressing the prevalent issues of the NISQ era once again, this paper debuts a novel technique that helps quantum-classical hybrid algorithms run efficiently on small qubit systems. The technique can be handy in this era since most practical uses of quantum computers involve a hybrid setup in which a quantum computer is paired with a classical computer. To illustrate, the aforementioned problem of finding the ground state of a quantum particle can be solved by a Variational Quantum Eigensolver (VQE), which uses both classical and quantum algorithms to estimate the lowest eigenvalue of a Hamiltonian. Running such hybrid algorithms efficiently is difficult, but the new method for engineering cost functions outlined in this paper could allow small qubit systems to do so.

     [Image via Intel]

     Finally, on the penultimate day of the conference, Dr. Anne Matsuura, Director of Quantum Applications and Architecture at Intel Labs, will deliver a keynote titled “Quantum Computing: A Scalable, Systems Approach”, underscoring Intel’s strategy of taking a systems-oriented, workload-driven view of quantum computing to commercialize quantum computers in the NISQ era:

     “Quantum computing is steadily transitioning from the physics lab into the domain of engineering as we prepare to focus on useful, nearer-term applications for this disruptive technology. Quantum research within Intel Labs is making solid advances in every layer of the quantum computing stack – from spin qubit hardware and cryo-CMOS technologies for qubit control to software and algorithms research that will put us on the path to a scalable quantum architecture for useful commercial applications. Taking this systems-level approach to quantum is critical in order to achieve quantum practicality.”

     The research works outlined above accentuate Intel’s efforts to develop useful applications that are ready to run on near-term, smaller-qubit quantum machines. They also put the tech giant alongside IBM and Zapata, which are likewise working on the commercialization of quantum computers.
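To ground the "lowest eigenvalue of a Hamiltonian" idea that two of these papers revolve around, here is the ground-state problem in miniature, solved exactly with NumPy. The specific Hamiltonian (a two-qubit transverse-field Ising model with illustrative coefficients) is an assumption for the demo; it is the quantity a VQE would estimate variationally on quantum hardware.

```python
# The ground-state problem in miniature: the lowest eigenvalue of a
# Hamiltonian matrix is the ground-state energy that algorithms like VQE
# approximate. Here we diagonalize a tiny two-qubit example exactly.
import numpy as np

I = np.eye(2)
X = np.array([[0, 1], [1, 0]])   # Pauli-X
Z = np.array([[1, 0], [0, -1]])  # Pauli-Z

def kron(*ops):
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

# H = -J Z1 Z2 - h (X1 + X2): a 4x4 matrix for two qubits (illustrative J, h).
J, h = 1.0, 0.5
H = -J * kron(Z, Z) - h * (kron(X, I) + kron(I, X))

energies = np.linalg.eigvalsh(H)  # exact classical diagonalization, ascending order
print(f"Ground-state energy: {energies[0]:.4f}")
# A VQE would instead minimize <psi(theta)|H|psi(theta)> over a parameterized
# circuit, because exact diagonalization scales exponentially with qubit count.
```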
  19. Researchers probe into RNA using deep learning to develop sensors for a COVID-19 diagnostic by Ather Fawaz

A genome is a genetic blueprint that determines an organism's characteristics. Deoxyribonucleic acid (DNA) and, usually in the case of viruses, ribonucleic acid (RNA) are the building blocks of genomic sequences, and manipulating these nucleic acids directly can lead to tangible changes in the organism. As such, developments in genetic engineering focus on our ability to manipulate genomic sequences. But this is a daunting task. For example, precisely controlling a specific class of engineered RNA molecules called "toehold switches" can lend vital insight into cellular environments and potential diseases. However, previous experiments have shown that toehold switches are hard to control in practice: many don't respond to modifications even though they have been engineered to produce the desired output in response to a given input based on known RNA folding rules.

Considering this, two teams of researchers from the Wyss Institute at Harvard University and MIT have developed a set of machine learning algorithms to improve this process. Specifically, they used deep learning to analyze a large volume of toehold switch sequences and accurately predict which toeholds perform their intended tasks reliably, thereby allowing researchers to identify high-quality toeholds for their experiments. Their findings have been published in Nature in two separate papers today.

As with any machine learning problem, the first step was to collect domain-specific data to train the models on. The researchers assembled a large dataset of toehold switch sequences. Alex Garruss, co-first author and a graduate student working at the Wyss, stated: "We designed and synthesized a massive library of toehold switches, nearly 100,000 in total, by systematically sampling short trigger regions along the entire genomes of 23 viruses and 906 human transcription factors."

With this dataset in hand, the two teams approached the problem with different techniques: one used a computer vision-based algorithm to analyze the toehold sequences as two-dimensional images, while the other used natural language processing to interpret the sequences as "words" written in the "language" of RNA.

The authors of the first paper decided to analyze toehold switches not as sequences of bases, but as 2D images of base-pair possibilities. This approach, called Visualizing Secondary Structure Saliency Maps, or VIS4Map, successfully identified physical elements of the toehold switches that influenced their performance, providing insight into RNA folding mechanisms that had not been discovered using traditional analysis techniques.

Image via Wyss Institute at Harvard University

The authors of the second paper created two different deep learning architectures that approached the challenge of identifying 'susceptible' toehold switches using orthogonal techniques. The first model was based on a convolutional neural network (CNN) and a multi-layer perceptron (MLP) and treated the toehold sequences as 1D images, or lines of nucleotide bases. Using an optimization technique called the Sequence-based Toehold Optimization and Redesign Model (STORM), it identified patterns of bases and potential interactions between those bases to mark the toeholds of interest.
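To make the "sequences as 1D images" idea concrete, below is a hedged sketch in Python using PyTorch. The one-hot encoding, the layer sizes, and the example sequence are all our own illustrative choices, not the actual STORM architecture; only the 30-nucleotide toehold length is taken from the article.

```python
import torch
import torch.nn as nn

# One-hot encode an RNA sequence (A, C, G, U) into a 4-channel "1D image".
BASES = {"A": 0, "C": 1, "G": 2, "U": 3}

def one_hot(seq: str) -> torch.Tensor:
    x = torch.zeros(4, len(seq))
    for i, base in enumerate(seq):
        x[BASES[base], i] = 1.0
    return x

# A tiny CNN mapping a (4, L) sequence tensor to a functionality score.
# Layer widths are arbitrary stand-ins, not the published model.
model = nn.Sequential(
    nn.Conv1d(in_channels=4, out_channels=16, kernel_size=5, padding=2),
    nn.ReLU(),
    nn.AdaptiveAvgPool1d(1),  # pool over the sequence dimension
    nn.Flatten(),
    nn.Linear(16, 1),
    nn.Sigmoid(),             # predicted probability of a working switch
)

seq = "AUGGCUAAGCUAGCUAGGCAUCGAUCGAUA"   # made-up 30-nt example
score = model(one_hot(seq).unsqueeze(0))  # add a batch dimension
print(score.item())
```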
The second architecture mapped the problem onto the domain of natural language processing (NLP), treating each toehold sequence as a phrase consisting of patterns of words. The task was then to train a model to combine these words, or nucleotide bases, into a coherent phrase. This model was integrated with the CNN-based model to create Nucleic Acid Speech (NuSpeak). This optimization technique redesigned the last nine nucleotides of a given toehold switch while keeping the remaining 21 nucleotides intact, allowing for the creation of specialized toeholds that detect the presence of specific pathogenic RNA sequences and could be used to develop new diagnostic tests. By using both models sequentially, the researchers were able to predict which toehold sequences would produce high-quality sensors.

Image via Wyss Institute at Harvard University

To test both models, the researchers used their optimized toehold switches to sense fragments of SARS-CoV-2, the virus that causes COVID-19. NuSpeak improved the sensors' performance by an average of 160%, while STORM created better versions of four SARS-CoV-2 viral RNA sensors, improving their performance by up to 28 times. Apropos these impressive results, Katie Collins, co-first author of the second paper and an MIT student at the Wyss Institute, stated: "A real benefit of the STORM and NuSpeak platforms is that they enable you to rapidly design and optimize synthetic biology components, as we showed with the development of toehold sensors for a COVID-19 diagnostic."

Diogo Camacho, a corresponding author of the second paper and a Senior Bioinformatics Scientist and co-lead of the Predictive BioAnalytics Initiative at the Wyss Institute, stated: "Perhaps the most important aspect of the tools we developed in these papers is that they are generalizable to other types of RNA-based sequences such as inducible promoters and naturally occurring riboswitches, and therefore can be applied to a wide range of problems and opportunities in biotechnology and medicine."

Moving forward, as Camacho envisioned, the teams are looking to generalize their algorithms to other problems in synthetic biology and potentially accelerate the development of biotechnology tools.
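As a footnote to the "language of RNA" framing above: a sequence is typically turned into "words" by splitting it into overlapping k-mers before feeding it to a language model. The 3-mer choice below is a common convention in sequence NLP and our own assumption, not a detail taken from the NuSpeak paper.

```python
def kmer_tokens(seq: str, k: int = 3) -> list[str]:
    """Split an RNA sequence into overlapping k-mer 'words'."""
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

# A toehold-style 30-nt sequence becomes a 28-word "phrase" that a
# language model can ingest much like ordinary text.
print(kmer_tokens("AUGGCUAAGCUAGCUAGGCAUCGAUCGAUA"))
```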
  20. Sycamore ran the largest chemical simulation performed on a quantum computer to date by Ather Fawaz

Google's Sycamore 54-qubit quantum computer. Image by Rocco Ceselin via Google AI Blog

Computer simulations in chemistry are a powerful way of understanding and optimizing chemical processes. But as the size of the system under consideration grows, so does the computational complexity of modeling the quantum chemical equations, because increasing the size of a system leads to exponential scaling in the number and statistics of the quantum variables involved. As such, exactly solving the quantum chemical equations for all but the smallest systems remains intractable for modern classical computers. Quantum computers, which can represent these quantum variables natively, offer a way around this scaling problem.

In a study published in Science today, the Google AI Quantum team sought to accelerate current quantum chemistry simulation techniques. The researchers used the famous 54-qubit quantum computer Sycamore, which achieved quantum supremacy last year, to run the largest chemical simulation performed on a quantum computer to date. Though the calculation focused on the Hartree-Fock approximation of a real chemical system, it was twice as large as previous chemistry calculations on a quantum computer and contained ten times as many quantum gate operations. Part of the simulation involved modeling the isomerization of diazene and the binding energies of hydrogen chains:

"Here, we perform a series of quantum simulations of chemistry, the largest of which involved a dozen qubits, 78 two-qubit gates, and 114 one-qubit gates. We model the binding energy of H6, H8, H10 and H12 chains as well as the isomerization of diazene."

To achieve this, the team used a noise-robust variational quantum eigensolver (VQE) to directly simulate a chemical mechanism via a quantum algorithm. VQE was important because quantum calculations are prone to noise that causes inaccuracies. Essentially, the technique treats a quantum processor like a neural network and optimizes the quantum circuit's parameters by dynamically minimizing a cost function, thereby compensating for errors during computation.

Controlling the hardware at this precision is a challenge of its own. Sycamore has 54 qubits and consists of over 140 individually tunable elements, each controlled with high-speed, analog electrical pulses. Achieving precise control over the whole device requires fine-tuning more than 2,000 control parameters, and even small errors in these parameters can quickly add up to large errors in the total computation. For this, the team used an automated framework that mapped the problem onto a graph with thousands of vertices, where each vertex represents a physics experiment to determine a single unknown parameter. Traversing this graph takes the device from basic priors to a high-fidelity quantum processor, and can be done in less than a day. Ultimately, these techniques, along with algorithmic error mitigation, enabled an orders-of-magnitude reduction in the errors.

Energy predictions of molecular geometries by the Hartree-Fock model simulated on 10 qubits of Sycamore.
Image via Google AI Blog

With this setup in place, Google's team not only ran the largest chemical simulation performed on a quantum computer to date, but also provided a proof of concept that the proposed method can achieve chemical accuracy when VQE is combined with error-mitigation strategies. Moving forward, the researchers hope their experiment serves as a blueprint for future simulations on quantum processors. The complete code for running this experiment has been posted on GitHub. Further details and the original research paper can be found here.
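The calibration-as-graph idea above is worth a small illustration. In the sketch below, each vertex is an experiment that pins down one control parameter, and the traversal simply runs experiments in an order that respects their prerequisites. The five-experiment graph and its dependencies are invented for illustration; Sycamore's real graph has thousands of vertices.

```python
from graphlib import TopologicalSorter

# Toy calibration dependency graph: each experiment maps to the set of
# experiments whose results it needs first (illustrative only).
experiments = {
    "qubit_frequency": set(),
    "pulse_amplitude": {"qubit_frequency"},
    "readout_threshold": {"qubit_frequency"},
    "single_qubit_gate": {"pulse_amplitude"},
    "two_qubit_gate": {"single_qubit_gate", "readout_threshold"},
}

# Traverse from basic priors about the device toward high-fidelity gates:
# predecessors are always calibrated before the experiments that need them.
for experiment in TopologicalSorter(experiments).static_order():
    print("calibrating:", experiment)
```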
  21. Investment portfolio built with graph neural networks demonstrates promising results by Ather Fawaz

Graph neural networks (GNNs) are a relatively recent development in the field of machine learning. Like traditional graphs, GNNs model the dependencies and relationships between nodes. A particularly nifty feature of this architecture is that, unlike standard neural networks, GNNs retain a state that can represent information from a node's neighborhood to arbitrary depth. Building on this, Bloomberg recently released a white paper that leverages graph neural networks to construct an investment portfolio based on supply chain data. The researchers used Bloomberg's Supply Chain dataset, which contains data on over 100,000 companies, to develop an extension of the classical customer momentum strategy proposed by Cohen and Frazzini in 2008.

Since a given company in the dataset may be connected to multiple companies, the entire supply chain dataset can be treated as one gargantuan graph, in which each company is represented by a node and its supply chain relationships by directed edges between the nodes. Bloomberg took this idea and mapped it onto a problem that could be solved via GNNs, and the intrinsic architecture of GNNs allowed the traditional analysis of supply chain data to be generalized in multiple ways. First, it allowed the inclusion of more features of customer companies, taking into account their market cap, volatility, and turnover, and considering different lookback horizons. Similarly, GNNs were able to factor in the propagation effects from downstream customer firms and incorporate additional supply chain relationship features to develop an optimal weighting scheme.

The end result was a graph neural network model that generated a long-short portfolio demonstrating "an improved Sharpe Ratio compared with the classical strategies and its alpha was still robust to a Fama/French five-factor attribution." This indicates the "usefulness of using a company's customer relationships from the supply chain dataset in conjunction with graph neural networks," Bloomberg wrote.

Moving forward, the researchers believe that by modifying the underlying graph structure to support different edge types, the model can be extended to include more features, such as the suppliers of a company and the features of a target company. For companies that already have a traditional supply chain model in place, a GNN can be used to test whether the graph structure of the supply chain dataset might increase the model's predictive power.
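To give a flavor of the message passing at the heart of a GNN, the sketch below implements a single graph-convolution layer in NumPy: each company node updates its features by averaging over its supply-chain neighbors (plus itself) and applying a learned linear map. The toy adjacency matrix, the three features, and the single-layer design are all our own simplifications, far removed from Bloomberg's actual model.

```python
import numpy as np

# Directed supply-chain toy graph: adj[i, j] = 1 if company j is a
# customer of company i (invented for illustration).
adj = np.array([
    [0, 1, 1, 0],
    [0, 0, 0, 1],
    [0, 0, 0, 1],
    [0, 0, 0, 0],
], dtype=float)

# Per-company features, e.g. momentum, volatility, turnover (made up).
features = np.array([
    [0.02, 0.15, 0.8],
    [0.01, 0.30, 0.5],
    [-0.03, 0.22, 0.9],
    [0.04, 0.18, 0.6],
])

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 3))  # would be learned by training in a real model

# One round of message passing: add self-loops, row-normalize the
# adjacency, aggregate neighbor features, then apply a ReLU nonlinearity.
a_hat = adj + np.eye(4)
a_norm = a_hat / a_hat.sum(axis=1, keepdims=True)
updated = np.maximum(a_norm @ features @ W, 0.0)
print(updated)
```

Stacking several such layers is what lets information propagate from customers of customers, which is how the propagation effects from downstream firms mentioned above enter the model.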
  22. SpaceX reveals further details for its spaceport in Texas by Ather Fawaz

Image via SpaceX (Twitter)

Back in June, we received confirmation from SpaceX founder and CEO Elon Musk that the company is building a floating spaceport for space travel and hypersonic flights around the Earth. The floating spaceport would be built from refurbished oil platforms, house a hyperloop for transportation to and from land, and be based in Boca Chica, Texas. Now, we have more details about the project.

As spotted by Michael Sheetz (@thesheetztweetz) of CNBC, a new SpaceX job posting for a 'Resort Development Manager' in Brownsville, Texas, describes the future spaceport at Boca Chica Village as a "21st-century Spaceport" and the company's first resort. Before listing the key responsibilities expected of a resort development manager, the company states:

"SpaceX is committed to developing revolutionary space technology, with the ultimate goal of enabling people to live on other planets. Boca Chica Village is our latest launch site dedicated to Starship, our next generation launch vehicle. SpaceX is committed to developing this town into a 21st century Spaceport. We are looking for a talented Resort Development Manager to oversee the development of SpaceX's first resort from inception to completion."

Judging from this, and the fact that Starship is now the major focal point of SpaceX's efforts, development and construction of the spaceport have picked up pace. Just recently, the SN5 prototype for Starship completed a liftoff and landing sequence as part of a test, though Musk stated that it will be at least another two to three years before complete test flights commence.
  23. Boeing will continue to support the ISS through 2024 under a new contract extension by Ather Fawaz

Image via Jim Watson (AFP), Getty Images

Back in 1993, the National Aeronautics and Space Administration (NASA) selected Boeing as the prime contractor for the International Space Station (ISS). Since then, the two organizations have collaborated on advancing scientific research aboard the space station and human spaceflight in general. Now, Boeing has announced that it will continue supporting the ISS through September 2024 under a $916 million contract extension awarded today.

Valued at about $225 million annually, the contract has Boeing managing the ISS's many systems and providing resources, engineering support services, and personnel for activities aboard the space station. Furthermore, the new contract can be extended beyond 2024.

Apropos the contract extension, Boeing Vice President and Program Manager for the ISS, John Mulholland, commented: "As the International Space Station marks its 20th year of human habitation, Boeing continues to enhance the utility and livability of the orbiting lab we built for NASA decades ago. We thank NASA for their confidence in our team and the opportunity to support the agency's vital work in spaceflight and deep-space exploration for the benefit of all humankind."

Boeing has also been involved in other initiatives advancing human spaceflight. It is currently one of the two contenders in NASA's Commercial Crew program alongside SpaceX, and its CST-100 Starliner spacecraft is in development to ferry astronauts to the ISS. The firm is also building the core stage of NASA's Space Launch System, with which it hopes to make space travel to lunar orbit and Mars a tangible reality.
  24. Researchers build a pencil-paper based biometric wearable that can monitor body signals by Ather Fawaz

Photo by Miguel Á. Padriñán from Pexels

Researchers from the University of Missouri recently demonstrated the ability to build biometric wearables using a regular pencil and paper (via Engadget). These on-skin electronic devices can monitor a variety of local physical conditions, and include biophysical (temperature, biopotential) sensors, sweat biochemical (pH, uric acid, glucose) sensors, thermal stimulators, and humidity energy harvesters.

The modus operandi of the invention is as follows. Graphite is a good conductor of electricity thanks to its sea of free, delocalized electrons, so a pencil can be used to draw graphite patterns that serve as conductive traces and sensing electrodes. Regular, ubiquitous office paper, in turn, works as a flexible supporting substrate for the device.

Image via University of Missouri

With this setup, the device performs real-time, continuous, and high-fidelity monitoring of a range of vital biophysical and biochemical signals from the human body, including instantaneous heart rate, respiratory rate, sweat pH, uric acid, and glucose, skin temperature, electrocardiograms, electromyograms, and alpha, beta, and theta rhythms, among others. The researchers noted that the quality of the results is comparable to those from conventional methods of measuring the same quantities.

In the same study, pencil-paper-based antennas, 2D and 3D circuits with LEDs and batteries, reconfigurable assembly, and biodegradable electronics were also explored. Moving forward, the team wants to further test this low-cost, accessible pencil-and-paper setup before commercializing it.
  25. Perseverance Rover's launch window pushed back to July 22 by Ather Fawaz

Image via NASA JPL

We're within touching distance of the commencement of Perseverance's voyage to Mars, but the launch has been pushed back a couple of days. In a recent update, NASA and United Launch Alliance announced that they are now targeting Wednesday, July 22 for the rover's launch, with a two-hour window opening at 9:35 AM ET at Space Launch Complex 41 on Cape Canaveral Air Force Station.

The initial plan was to launch the rover on July 17, but that was delayed to July 20 due to ground system equipment issues. The latest update pushes the launch window back by two more days to July 22. According to NASA, this change was caused by a "processing delay encountered during encapsulation activities of the spacecraft" due to the time "needed to resolve a contamination concern in the ground support lines in NASA's Payload Hazardous Servicing Facility." The space agency clarified that the Atlas V rocket and the rover themselves remain fit for the mission.

Perseverance is slated to touch down at the Jezero Crater on Mars in February 2021, which will make it the first spacecraft in the history of planetary exploration with the ability to accurately re-target its point of touchdown during the landing sequence. Once there, the rover will begin exploring ancient habitable conditions, searching for potential microbial life, and more on the Red Planet.