Science


A Look into the Educational Impact of LOC Scientific Education

Some rules are simply not meant to be broken, and a chemistry lab is one place where you must follow every rule that has been put in place. The rules exist to keep you safe and free from injury at all times.

Since LOC Scientific Education has partnered with a range of institutions and donated lab equipment, we'll look at its impact on the education sector.

We'll also cover some of the lab safety rules that LOC Scientific Education recommends.

 

LOC Scientific Education. Photo: teachwire.net

⓵ Always Read Through the Safety Information Present on the Chemicals

Every chemical used in the lab comes with a safety data sheet. Read through all of its recommendations so that you can handle the chemical safely, and dispose of chemicals accordingly.

⓶ You Should Not Pipette Using Your Mouth

You may assume it's safe to pipette water by mouth. The water may well be clean; the issue is, how clean is the pipette? Even pipettes meant to be disposable are often cleaned and reused. Always use a pipette bulb.

⓷ Wear the Appropriate Clothing

Do not wear sandals in the laboratory; wear closed shoes and long pants instead. Tie back long hair, and always wear safety goggles and a lab coat. However careful you are, follow the safety regulations that have been put in place, and set a good example for others.

⓸ You Should Not Sniff or Taste Chemicals

Sniffing the chemicals in the lab exposes you to danger, so pay close attention to the safety instructions. If a chemical is meant to be handled in a fume hood, do so. Never attempt to taste any chemical.

⓹ Don’t Play the Role of Mad Scientist

Never mix chemicals haphazardly. Pay attention to the order in which you add them, and follow the instructions to the letter. Be careful even when the product is harmless: mixing sodium hydroxide and hydrochloric acid yields salt water, but the heat of the reaction can break the glassware.

⓺ Never Dispose of Chemicals Down the Drain Casually

Some chemicals can safely be washed down the drain, but others must be disposed of differently. When a chemical is allowed down the sink, flush it with plenty of water to avoid any unexpected reactions.

About Automated Laboratory Monitoring

Technology has advanced to the point where laboratory automation is practical, and the lab environment has gained significantly from it: automation improves both repeatability and accuracy.

Automation yields valuable insights that increase productivity and efficiency, and it eliminates time-consuming manual checks. It also supports remote work and adds flexibility.

To realize all the benefits of automation, a laboratory should have a monitoring system that runs around the clock and tracks every parameter in the facility, such as humidity and temperature.

Why Should Lab Monitoring be Automated?

Monitoring lab equipment and instruments, the facility itself, and environmental parameters improves both data quality and workflow, and keeps valuable samples safe. However, tracking every parameter in a lab by hand is time-consuming.

Manual documentation and monitoring cannot deliver accurate information on trends and sample conditions; tracking things by hand gives you a false sense of security. Relying on separate, discrete systems creates data silos, so you never get a clear picture of all the relevant parameters.

A comprehensive automated monitoring system improves efficiency by eliminating manual documentation and checks. It delivers relevant notifications and alerts so that skilled staff can focus on complex tasks. A sufficiently broad system can bring every parameter into one platform, provide predictive insights, and generate custom reports.
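
As a minimal illustration of the alerting such a system automates, the sketch below flags out-of-range readings. The freezer log, thresholds, and function name are hypothetical, not drawn from any particular monitoring product.

```python
def check_readings(readings, low, high):
    """Return alert messages for readings outside the band [low, high]."""
    alerts = []
    for timestamp, value in readings:
        if not (low <= value <= high):
            alerts.append(f"{timestamp}: {value:.1f} outside [{low}, {high}]")
    return alerts

# Hypothetical temperature log (in Celsius) for a -80 C lab freezer,
# with an illustrative acceptable band of -86 to -76.
log = [("09:00", -79.5), ("09:10", -78.9), ("09:20", -74.2), ("09:30", -79.8)]
for alert in check_readings(log, low=-86.0, high=-76.0):
    print(alert)  # flags the -74.2 reading at 09:20
```

A real system layers trend analysis, escalation, and audit logging on top of this basic threshold check.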

Monitoring systems are good at surfacing insights, but an alert alone won't explain a fluctuation. A freezer monitor, for example, can flag temperature swings, which is useful; you still need to find the root cause. Is the compressor failing? Are the freezer doors opened too frequently? Are other factors involved?

With LOC Scientific Education, you come to understand the importance of meeting the various safety requirements of the lab setting. Through its partnerships with different institutions, its impact has been both significant and positive.

The post A Look into the Educational Impact of LOC Scientific Education appeared first on Dumb Little Man.

Laser-initiated fusion leads the way to safe, affordable clean energy

Siegfried Glenzer
Contributor

Siegfried Glenzer, a recipient of the Ernest Orlando Lawrence Award, is a professor and high-energy-density division director at Stanford’s SLAC National Accelerator Laboratory and a science adviser for nuclear fusion company Marvel Fusion.

The quest to make fusion power a reality recently took a massive step forward. The National Ignition Facility (NIF) at Lawrence Livermore National Laboratory announced the results of an experiment with an unprecedented high fusion yield. A single laser shot initiated reactions that released 1.3 megajoules of fusion yield energy with signatures of propagating nuclear burn.

Reaching this milestone indicates just how close fusion actually is to achieving power production. The latest results demonstrate the rapid pace of progress — especially as lasers are evolving at breathtaking speed.

Indeed, the laser is one of the most impactful technological inventions since the end of World War II. Finding widespread use in an incredibly diverse range of applications — including machining, precision surgery and consumer electronics — lasers are an essential part of everyday life. Few know, however, that lasers are also heralding an exciting and entirely new chapter in physics: enabling controlled nuclear fusion with positive energy gain.

After six decades of innovation, lasers are now assisting us in the urgent process of developing clean, dense and efficient fuels, which, in turn, are needed to help solve the world’s energy crisis through large-scale decarbonized energy production. The peak power attainable in a laser pulse has increased every decade by a factor of 1,000.

Physicists recently conducted a fusion experiment that produced 1,500 terawatts of power. For a short period, this was four to five times more power than the whole world consumes at any given moment. In other words, we can already produce vast amounts of power. Now we also need to produce vast amounts of energy, enough to offset the energy expended to drive the igniting lasers.
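
To put the NIF milestone in perspective, a back-of-envelope gain calculation: the 1.3 megajoules of yield reported above, divided by the laser energy delivered to the target. The input figure of roughly 1.9 MJ is an assumption based on commonly reported specifications for that shot, not a number from this article.

```python
# Target gain Q = fusion energy out / laser energy delivered to the target.
# Q >= 1 is "scientific breakeven"; a power plant needs far more than that,
# since driving the lasers costs much more energy than reaches the target.
yield_mj = 1.3            # fusion yield from the NIF shot (from the text)
laser_on_target_mj = 1.9  # assumed laser energy on target (commonly cited)

target_gain = yield_mj / laser_on_target_mj
print(f"Target gain Q = {target_gain:.2f}")  # ~0.68, just short of breakeven
```

This is why the text distinguishes producing vast power (already done) from producing vast net energy (still ahead).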

Beyond lasers, there are also considerable advances on the target side. The recent use of nanostructure targets allows for more efficient absorption of laser energies and ignition of the fuel. This has only been possible for a few years, but here, too, technological innovation is on a steep incline with tremendous advancement from year to year.

In the face of such progress, you may wonder what is still holding us back from making commercial fusion a reality.

There remain two significant challenges: First, we need to bring the pieces together and create an integrated process that satisfies all the physical and technoeconomic requirements. Second, we require sustainable levels of investment from private and public sources to do so. Generally speaking, the field of fusion is woefully underfunded. This is shocking given the potential of fusion, especially in comparison to other energy technologies.

Investments in clean energy amounted to more than $500 billion in 2020. The funds that go into fusion research and development are only a fraction of that. There are countless brilliant scientists working in the sector already, as well as eager students wishing to enter the field. And, of course, we have excellent government research labs. Collectively, researchers and students believe in the power and potential of controlled nuclear fusion. We should ensure financial support for their work to make this vision a reality.

What we need now is an expansion of public and private investment that does justice to the opportunity at hand. Such investments may have a longer time horizon, but their eventual impact is without parallel. I believe that net-energy gain is within reach in the next decade; commercialization, based on early prototypes, will follow in very short order.

But such timelines are heavily dependent on funding and the availability of resources. Considerable investment is being allocated to alternative energy sources — wind, solar, etc. — but fusion must have a place in the global energy equation. This is especially true as we approach the critical breakthrough moment.

If laser-driven nuclear fusion is perfected and commercialized, it has the potential to become the energy source of choice, displacing the many existing, less ideal energy sources. This is because fusion, if done correctly, offers energy that is in equal parts clean, safe and affordable. I am convinced that fusion power plants will eventually replace most conventional power plants and related large-scale energy infrastructure that are still so dominant today. There will be no need for coal or gas.

The ongoing optimization of the fusion process, which results in higher yields and lower costs, promises energy production at much below the current price point. At the limit, this corresponds to a source of unlimited energy. If you have unlimited energy, then you also have unlimited possibilities. What can you do with it? I foresee reversing climate change by taking out the carbon dioxide we have put into the atmosphere over the last 150 years.

With a future empowered by fusion technology, you would also be able to use energy to desalinate water, creating unlimited water resources that would have an enormous impact in arid and desert regions. All in all, fusion enables better societies, keeping them sustainable and clean rather than dependent on destructive, dirty energy sources and related infrastructures.

Through years of dedicated research at the SLAC National Accelerator Laboratory, the Lawrence Livermore National Laboratory and the National Ignition Facility, I was privileged to witness and lead the first inertial confinement fusion experiments. I saw the seed of something remarkable being planted and taking root. I have never been more excited than I am now to see the fruits of laser technology harvested for the empowerment and advancement of humankind.

My fellow scientists and students are committed to moving fusion from the realm of tangibility into that of reality, but this will require a level of trust and help. A small investment today will have a big impact toward providing a much needed, more welcome energy alternative in the global arena.

I am betting on the side of optimism and science, and I hope that others will have the courage to do so, too.

Seqera Labs grabs $5.5M to help sequence Covid-19 variants and other complex data problems

Bringing order and understanding to unstructured information scattered across disparate silos has been one of the more significant breakthroughs of the big data era. Today, a European startup that has built a platform to tackle this challenge specifically in the life sciences — one notably used by labs to sequence and, so far, identify two major Covid-19 variants — is announcing funding to extend its tools to a wider set of use cases and to expand into North America.

Seqera Labs, a Barcelona-based data orchestration and workflow platform tailored to help scientists and engineers order and gain insights from cloud-based genomic data troves, as well as to tackle other life science applications that involve harnessing complex data from multiple locations, has raised $5.5 million in seed funding.

Talis Capital and Speedinvest co-led this round, with participation also from previous backer BoxOne Ventures and a grant from the Chan Zuckerberg Initiative, Mark Zuckerberg and Dr. Priscilla Chan’s effort to back open source software projects for science applications.

Seqera — a portmanteau of “sequence” and “era”, the age of sequencing data, basically — had previously raised less than $1 million and, quietly, is already generating revenue, with five of the world’s biggest pharmaceutical companies in its customer base, alongside biotech and other life sciences customers.

Seqera was spun out of the Centre for Genomic Regulation (CRG), a biomedical research center based in Barcelona, where it was built as the commercial application of Nextflow, the open-source workflow and data orchestration software originally created at the CRG by Seqera's founders, Evan Floden and Paolo Di Tommaso.

Floden, Seqera’s CEO, told TechCrunch that he and Di Tommaso were motivated to create Seqera in 2018 after seeing Nextflow gain a lot of traction in the life science community, and subsequently getting a lot of repeat requests for further customization and features. Both Nextflow and Seqera have seen a lot of usage: the Nextflow runtime has been downloaded over 2 million times, the company said, while Seqera’s commercial cloud offering has now processed more than 5 billion tasks.

The Covid-19 pandemic is a classic example of the acute challenge that Seqera (and by association Nextflow) aims to address in the scientific community. With Covid-19 outbreaks happening globally, each time a test for Covid-19 is processed in a lab, live genetic samples of the virus get collected. Taken together, these millions of tests represent a goldmine of information about the coronavirus and how it is mutating, and when and where it is doing so. For a new virus about which so little is understood and that is still persisting, that’s invaluable data.

So the problem is not whether the data for better insights exists (it does); it is that viewing that data as a holistic body is nearly impossible with legacy tools. It's in too many places, there is too much of it, and it's growing (and changing) every day, which means the traditional approach of porting data to a centralized location to run analytics on it would be inefficient and would cost a fortune to execute.

That is where Seqera comes in. The company's technology treats each source of data across different clouds as a salient pipeline that can be merged and analyzed as a single body, without the data ever leaving the infrastructure where it already lives. With the platform customized to focus on genomic troves, scientists can then query that information for further insights. Seqera was central to the discovery of both the alpha and delta variants of the virus, and the work continues as Covid-19 hammers the globe.

Seqera is being used in other kinds of medical applications, such as in the realm of so-called “precision medicine.” This is emerging as a very big opportunity in complex fields like oncology: cancer mutates and behaves differently depending on many factors, including genetic differences of the patients themselves, which means that treatments are less effective if they are “one size fits all.”

Increasingly, we are seeing approaches that leverage machine learning and big data analytics to better understand individual cancers and how they develop for different populations, to subsequently create more personalized treatments, and Seqera comes into play as a way to sequence that kind of data.

This also highlights something else notable about the Seqera platform: it is used directly by the people who are analyzing the data — that is, the researchers and scientists themselves, without data specialists necessarily needing to get involved. This was a practical priority for the company, Floden told me, but nonetheless, it’s an interesting detail of how the platform is inadvertently part of that bigger trend of “no-code/low-code” software, designed to make highly technical processes usable by non-technical people.

It’s both the existing opportunity, and how Seqera might be applied in the future across other kinds of data that lives in the cloud, that makes it an interesting company, and it seems an interesting investment, too.

“Advancements in machine learning, and the proliferation of volumes and types of data, are leading to increasingly more applications of computer science in life sciences and biology,” said Kirill Tasilov, principal at Talis Capital, in a statement. “While this is incredibly exciting from a humanity perspective, it’s also skyrocketing the cost of experiments to sometimes millions of dollars per project as they become computer-heavy and complex to run. Nextflow is already a ubiquitous solution in this space and Seqera is driving those capabilities at an enterprise level – and in doing so, is bringing the entire life sciences industry into the modern age. We’re thrilled to be a part of Seqera’s journey.”

“With the explosion of biological data from cheap, commercial DNA sequencing, there is a pressing need to analyse increasingly growing and complex quantities of data,” added Arnaud Bakker, principal at Speedinvest. “Seqera’s open and cloud-first framework provides an advanced tooling kit allowing organisations to scale complex deployments of data analysis and enable data-driven life sciences solutions.”

Although medicine and the life sciences are perhaps Seqera's most obvious and timely applications today, the framework originally designed for genetics and biology can be applied to a number of other areas: AI training, image analysis and astronomy are three early use cases, Floden said. Astronomy is perhaps very apt, since it seems the sky is the limit.

“We think we are in the century of biology,” Floden said. “It’s the center of activity and it’s becoming data-centric, and we are here to build services around that.”

Seqera is not disclosing its valuation with this round.

$100M donation powers decade-long moonshot to create solar satellites that beam power to Earth

It sounds like a plan concocted by a supervillain, if that villain’s dastardly end was to provide cheap, clean power all over the world: launch a set of three-kilometer-wide solar arrays that beam the sun’s energy to the surface. Even the price tag seems gleaned from pop fiction: one hundred million dollars. But this is a real project at Caltech, funded for nearly a decade largely by a single donor.

The Space-based Solar Power Project has been underway since at least 2013, when the first donation from Donald and Brigitte Bren came through. Donald Bren is the chairman of Irvine Company and on the Caltech board of trustees, and after hearing about the idea of space-based solar in Popular Science, he proposed to fund a research project at the university — and since then has given over $100M for the purpose. The source of the funds has been kept anonymous until this week, when Caltech made it public.

The idea emerges naturally from the current limitations of renewable energy. Solar power is ubiquitous on the surface, but of course highly dependent on the weather, season, and time of day. No solar panel, even in ideal circumstances, can work at full capacity all the time, and so the problem becomes one of transferring and storing energy in a smart grid. No solar panel on Earth, that is.

A solar panel in orbit, however, may be exposed to the full light of the sun nearly all the time, and with none of the reduction in its power that comes from that light passing through the planet’s protective atmosphere and magnetosphere.
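
A rough sense of what that buys: the comparison below uses stock textbook figures, not SSPP numbers — the solar constant above the atmosphere, typical clear-sky irradiance at the surface, and an assumed ~20% capacity factor for a fixed ground panel (night, weather, sun angle).

```python
# Approximate daily energy collected per square meter, orbit vs ground.
# All inputs are generic assumptions for illustration only.
SOLAR_CONSTANT = 1361.0   # W/m^2 above the atmosphere
GROUND_PEAK = 1000.0      # W/m^2, clear sky at solar noon
GROUND_CAPACITY = 0.20    # assumed average fraction of peak for a fixed panel

orbit_wh = SOLAR_CONSTANT * 24                  # near-continuous sun in a suitable orbit
ground_wh = GROUND_PEAK * GROUND_CAPACITY * 24  # day/night and weather averaged in

print(f"orbit:  {orbit_wh / 1000:.1f} kWh per m^2 per day")
print(f"ground: {ground_wh / 1000:.1f} kWh per m^2 per day")
print(f"ratio:  {orbit_wh / ground_wh:.1f}x")   # roughly 6-7x under these assumptions
```

The advantage is real but not free: the hard part, as the article notes, is getting that energy down through the atmosphere without losing most of it.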

The latest prototype created by the SSPP, which collects sunlight and transmits it at microwave frequencies.

“This ambitious project is a transformative approach to large-scale solar energy harvesting for the Earth that overcomes this intermittency and the need for energy storage,” said SSPP researcher Harry Atwater in the Caltech release.

Of course, you would need to collect enough energy that it’s worth doing in the first place, and you need a way to beam that energy down to the surface in a way that doesn’t lose most of it to the aforementioned protective layers but also doesn’t fry anything passing through its path.

These fundamental questions have been looked at systematically for the last decade, and the team is clear that without Bren’s support, this project wouldn’t have been possible. Attempting to do the work while scrounging for grants and rotating through grad students might have prevented its being done at all, but the steady funding meant they could hire long-term researchers and overcome early obstacles that might have stymied them otherwise.

The group has produced dozens of published studies and prototypes (which you can peruse here), including the lightest solar collector-transmitter made by an order of magnitude, and is now on the verge of launching its first space-based test satellite.

“[Launch] is currently expected to be Q1 2023,” co-director of the project Ali Hajimiri told TechCrunch. “It involves several demonstrators for space verification of key technologies involved in the effort, namely, wireless power transfer at distance, lightweight flexible photovoltaics, and flexible deployable space structures.”

Diagram showing how tiles like the one above could be joined together to form strips, then spacecraft, then arrays of spacecraft.

These will be small-scale tests (about 6 feet across), but the vision is for something rather larger. Bigger than anything currently in space, in fact.

“The final system is envisioned to consist of multiple deployable modules in close formation flight and operating in synchronization with one another,” Hajimiri said. “Each module is several tens of meters on a side, and the system can be built up by adding more modules over time.”

Image of how the final space solar installation could look, a kilometers-wide set of cells in orbit.

Image Credits: Caltech

Eventually the concept calls for a structure perhaps as large as 5-6 kilometers across. Don’t worry — it would be far enough out from Earth that you wouldn’t see a giant hexagon blocking out the stars. Power would be sent to receivers on the surface using directed, steerable microwave transmission. A few of these in orbit could beam power to any location on the planet full time.

Of course that is the vision, which is many, many years out if it is to take place at all. But don’t make the mistake of thinking of this as having that single ambitious, one might even say grandiose goal. The pursuit of this idea has produced advances in solar cells, flexible space-based structures, and wireless power transfer, each of which can be applied in other areas. The vision may be the stuff of science fiction, but the science is progressing in a very grounded way.

For his part, Bren seems to be happy just to advance the ball on what he considers an important task that might not otherwise have been attempted at all.

“I have been a student researching the possible applications of space-based solar energy for many years,” he told Caltech. “My interest in supporting the world-class scientists at Caltech is driven by my belief in harnessing the natural power of the sun for the benefit of everyone.”

We’ll check back with the SSPP ahead of launch.

The Business of Telemedicine is Growing – Here’s Why

The future of medicine is just a few taps away. Telehealth is less than 100 years old but is maturing at an unprecedented rate. Starting in the 1950s, closed-circuit television systems allowed communication between hospitals. In the 1970s, telehealth provided medical care to rural communities in Alaska. By the 1990s, videoconferencing for healthcare skyrocketed, leading to today. Even veterinary hospitals offer telehealth services. Virtual healthcare has a compound annual growth rate of 4.8%. Currently, telehealth is a $20+ billion industry that is expected to reach $186.5 billion by 2026.

In 2019, more than 75% of U.S.-based hospitals used video services to connect with their patients. COVID-19 has encouraged many to try telehealth for the first time. In just the first quarter of 2020, there were over 1,600,000 telehealth visits. Currently, 61% of Americans have had at least one telehealth appointment, a 3x increase since March 2020. Now that many have tried it, most Americans want telehealth to continue. 80% believe telehealth offers the same quality of care as in-person visits, which is up from 56% before the pandemic.

What is Telemedicine?

Telemedicine offers remote clinical services and is used to diagnose conditions, screen symptoms, offer low-risk urgent care, deliver specialist consultations, and provide mental health services. Telehealth provides remote non-clinical services and is much broader, including fulfilling medications, chronic condition support, and physical and occupational therapy services. Telehealth is booming thanks to its many formats: virtual healthcare platforms include video calls, mobile health, remote patient monitoring, texting services, software such as Nurx or BetterHelp, and phone calls.

In addition to its rising popularity, the barriers to telehealth are coming down. Public skepticism has declined; the vast majority of studies show patients now prefer telehealth over in-person visits. Patient privacy was a concern for 66% of adults, but consumers are growing more comfortable with their private medical records in the cloud. There were concerns about misdiagnosis, but studies show no significant difference between in-person and telehealth diagnoses. And while 41% of patients have limited access to the internet, federal broadband initiatives are addressing that gap.

What is Telehealth?

Telehealth implements many new technologies. Apps and smartphone gadgets such as MedWand, a diagnostic tool, and Headspace, a mental health product, contribute to overall wellness. Mail-in labs for allergies, food sensitivities, genetic testing, and COVID-19 testing are another available technology. Wearable devices such as sensor-embedded clothing and smartwatches can also help patients track their medical needs.

Telehealth is helping those in need by innovating solutions. Holistic healthcare includes the fields of cardiology, pulmonology, and endocrinology. Remote clinical services such as blood pressure monitors, anticoagulation testing, and ECG devices are all tools that help measure various health markers. Currently, 28% of consumers use tech to monitor their health. In addition, telehealth programs increase access. This healthcare method can serve high-risk or rural populations, increase healthcare cost parity, help control and diagnose low-incidence diseases, and save patients up to three hours of commuting or around 100 miles of travel.

82% of Americans say telehealth has made it easier to get the care they need. Telehealth has also expanded access to acute care. 59% of Medicare patients have access to a laptop. Telemedicine has also lowered healthcare costs, with savings estimates anywhere from 17% to 75%. Additionally, 22% of the United States population will be older than 65 in 2050, and telemedicine is helping the older population that requires needs-based healthcare.

What are the benefits of telehealth?

The benefits of telehealth are evident. Not only does telehealth increase access, but it also increases patient satisfaction and retention by 81.5%. Telemedicine is more convenient; there is no need to take time off work or commute to the doctor’s office. Many feel less anxiety and fear when seeing a doctor remotely, increasing comfort. In addition, 31% of patients say their healthcare costs decreased when using telehealth.

Patients now expect their doctors to provide telehealth and digital tools. 90% of physicians have experience with remote treatment, and 77% want a shift towards telehealth. Over 75% of patients would consider using telehealth, and 83% of patients expect to use it after 2020. Jonathan Linkous, the CEO of the American Telemedicine Association, states, “In an age where the average consumer manages nearly all aspects of life online, it’s a no-brainer that healthcare should be just as convenient, accessible, and safe as online banking.”


The post The Business of Telemedicine is Growing – Here’s Why appeared first on Dumb Little Man.

Masten Space Systems to develop a GPS-like network for the Moon

Masten Space Systems, a startup that’s aiming to send a lander to the Moon in 2023, will develop a lunar navigation and positioning system not unlike GPS here on Earth.

Masten’s prototype is being developed as part of a contract awarded through the Air Force Research Laboratory’s AFWERX program. Once deployed, it’ll be a first-of-its-kind off-world navigational system.

Up until this point, spacecraft heading to the Moon must carry equipment onboard to detect hazards and assist with navigation. To some extent, it makes sense that a shared navigation network has never been established: humans have only landed on the Moon a handful of times, and while there have been many more uncrewed landings, lunar missions still haven’t exactly been a regular occurrence.

But as the costs of going to orbit and beyond have drastically decreased, thanks in part to innovations in launch technology by companies like SpaceX, space is likely to get a lot busier. Many private companies and national space divisions have set their sights on the Moon in particular. Masten is one of them: it was chosen by NASA to deliver commercial and private payloads to a site near the Haworth Crater at the lunar south pole. That mission, originally scheduled for December 2022, was pushed back to November 2023.

Other entities are also looking to go to the Moon. Chief amongst them is NASA with its Artemis program, which will send two astronauts to the Moon’s surface in 2024. These missions will likely only increase in the coming decades, making a common navigation network more of a necessity.

“Unlike Earth, the Moon isn’t equipped with GPS so lunar spacecraft and orbital assets are essentially operating in the dark,” Masten’s VP of research and development Matthew Kuhns explained in a statement.

The system will work like this: spacecraft will deploy position, navigation and timing (PNT) beacons onto the lunar surface. The beacons will form a surface-based network that broadcasts radio signals, allowing spacecraft and other orbital assets to connect wirelessly for navigation, timing and location tracking.
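
The article doesn't describe the positioning math itself, but any beacon-based PNT network ultimately solves the same geometry problem GPS does: recovering a position from measured ranges to transmitters at known locations. Below is a minimal least-squares multilateration sketch with entirely hypothetical beacon coordinates, not Masten's actual algorithm.

```python
import numpy as np

def multilaterate(beacons, ranges):
    """Estimate a 2D position from >= 3 beacon positions and measured ranges.

    Linearizes the range equations by subtracting the first beacon's
    equation from the others, yielding a linear least-squares problem:
    for each i > 0:  2*(p_i - p_0) . x = (r_0^2 - r_i^2) + |p_i|^2 - |p_0|^2
    """
    beacons = np.asarray(beacons, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    p0, r0 = beacons[0], ranges[0]
    A = 2.0 * (beacons[1:] - p0)
    b = (r0**2 - ranges[1:]**2
         + np.sum(beacons[1:]**2, axis=1) - np.sum(p0**2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Hypothetical beacon layout (km) and a true rover position to recover.
beacons = [(0, 0), (10, 0), (0, 10), (10, 10)]
true_pos = np.array([3.0, 7.0])
ranges = [np.linalg.norm(true_pos - np.array(bc)) for bc in beacons]
print(multilaterate(beacons, ranges))  # recovers approximately [3. 7.]
```

A real lunar system works in three dimensions, must solve for a clock offset as well (ranges come from signal timing), and has to cope with noisy measurements, which is why the least-squares formulation matters.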

The company has already concluded Phase I of the project, which involved completing the concept design for the PNT beacons. The bulk of the engineering challenge will come in Phase II, when Masten will develop the beacons themselves. They must be able to withstand harsh lunar conditions, so Masten is partnering with defense and technology company Leidos to build shock-proof beacon enclosures. The aim is to complete the second phase in 2023.

“By establishing a shared navigation network on the Moon, we can lower spacecraft costs by millions of dollars, increase payload capacity, and improve landing accuracy near the most resource-rich sites on the Moon,” Kuhns said.

Stumble-proof robot adapts to challenging terrain in real time

Robots have a hard time improvising, and encountering an unusual surface or obstacle usually means an abrupt stop or hard fall. But researchers at Facebook AI have created a new model for robotic locomotion that adapts in real time to any terrain it encounters, changing its gait on the fly to keep trucking when it hits sand, rocks, stairs, and other sudden changes.

Although robotic movement can be versatile and exact, and robots can “learn” to climb steps, cross broken terrain and so on, these behaviors are more like individual trained skills that the robot switches between. And although robots like Spot famously can spring back from being pushed or kicked, the system is really just working to correct a physical anomaly while pursuing an unchanged policy of walking. There are some adaptive movement models, but some are very specific (for instance this one based on real insect movements) and others take long enough to work that the robot will certainly have fallen by the time they take effect.

Rapid Motor Adaptation, as the team calls it, came from the idea that humans and other animals are able to quickly, effectively, and unconsciously change the way they walk to fit different circumstances.

“Say you learn to walk and for the first time you go to the beach. Your foot sinks in, and to pull it out you have to apply more force. It feels weird, but in a few steps you’ll be walking naturally just as you do on hard ground. What’s the secret there?” asked senior researcher Jitendra Malik, who is affiliated with Facebook AI and UC Berkeley.

Certainly if you’ve never encountered a beach before, but even later in life when you have, you aren’t entering some special “sand mode” that lets you walk on soft surfaces. The way you change your movement happens automatically and without any real understanding of the external environment.

Visualization of the simulation environment. Of course the robot would not perceive any of this visually. Image credit: Berkeley AI Research, Facebook AI Research and CMU

“What’s happening is your body responds to the differing physical conditions by sensing the differing consequences of those conditions on the body itself,” Malik explained — and the RMA system works in similar fashion. “When we walk in new conditions, in a very short time, half a second or less, we have made enough measurements that we are estimating what these conditions are, and we modify the walking policy.”

The system was trained entirely in simulation, in a virtual version of the real world where the robot’s small brain (everything runs locally on the limited on-board compute unit) learned to maximize forward motion with minimum energy and avoid falling by immediately observing and responding to data coming in from its (virtual) joints, accelerometers, and other physical sensors.

To punctuate the total internality of the RMA approach, Malik notes that the robot uses no visual input whatsoever. People and animals with no vision can walk just fine, so why shouldn’t a robot? And since it’s impossible to estimate the “externalities” such as the exact friction coefficient of the sand or rocks it’s walking on, it simply keeps a close eye on itself.

“We do not learn about sand, we learn about feet sinking,” said co-author Ashish Kumar, also from Berkeley.

Ultimately the system ends up having two parts: a main, always-running algorithm actually controlling the robot’s gait, and an adaptive algorithm running in parallel that monitors changes to the robot’s internal readings. When significant changes are detected, it analyzes them — the legs should be doing this, but they’re doing this, which means the situation is like this — and tells the main model how to adjust itself. From then on the robot only thinks in terms of how to move forward under these new conditions, effectively improvising a specialized gait.
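In rough sketch form, that two-part structure might look like the code below. To be clear, the function names, shapes, and stand-in arithmetic are mine for illustration only; the real RMA system uses learned neural networks for both parts.

```python
# Illustrative sketch of the two-part control loop described above:
# a base policy that always runs, plus a parallel adaptation module that
# estimates environment "extrinsics" purely from the robot's own readings.

def adaptation_module(recent_history):
    """Estimate environment conditions from a short window (~0.5 s) of
    the robot's own sensor readings; no vision involved. Here: a simple
    per-dimension average as a stand-in for the learned estimator."""
    n = len(recent_history)
    return [sum(step[i] for step in recent_history) / n
            for i in range(len(recent_history[0]))]

def base_policy(state, extrinsics):
    """Always-running gait controller: maps current sensor state plus the
    extrinsics estimate to joint targets. Here: a toy linear rule."""
    return [s * 0.5 + e for s, e in zip(state, extrinsics)]

# One tick of the loop: the adaptation module digests recent readings,
# and the base policy uses that estimate to pick the next action.
history = [[0.1, 0.2], [0.3, 0.4]]     # fake joint/IMU readings
extrinsics = adaptation_module(history)
action = base_policy([1.0, 1.0], extrinsics)
```

The key design point survives even in this toy form: the gait controller never sees the terrain directly, only a compact estimate distilled from how the robot's own body has been responding.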

Footage of the robot not falling as it traverses various tough surfaces.

Image Credits: Berkeley AI Research, Facebook AI Research and CMU

After training in simulation, it succeeded handsomely in the real world, as the news release describes it:

The robot was able to walk on sand, mud, hiking trails, tall grass and a dirt pile without a single failure in all our trials. The robot successfully walked down stairs along a hiking trail in 70% of the trials. It successfully navigated a cement pile and a pile of pebbles in 80% of the trials despite never seeing the unstable or sinking ground, obstructive vegetation or stairs during training. It also maintained its height with a high success rate when moving with a 12kg payload that amounted to 100% of its body weight.

You can see examples of many of these situations in videos here or (very briefly) in the gif above.

Malik gave a nod to the research of NYU professor Karen Adolph, whose work has shown how adaptable and freeform the human process of learning how to walk is. The team’s instinct was that if you want a robot that can handle any situation, it has to learn adaptation from scratch, not have a variety of modes to choose from.

Just as you can’t build a smarter computer vision system by exhaustively labeling and documenting every object and interaction (there will always be more), you can’t prepare a robot for a diverse and complex physical world with 10, 100, even thousands of special parameters for walking on gravel, mud, rubble, wet wood, etc. For that matter you may not even want to specify anything at all beyond the general idea of forward motion.

“We don’t pre-program the idea that it has four legs, or anything about the morphology of the robot,” said Kumar.

This means the basis of the system — not the fully trained one, which ultimately did mold itself to quadrupedal gaits — can potentially be applied not just to other legged robots, but entirely different domains of AI and robotics.

“The legs of a robot are similar to the fingers of a hand; the way that legs interact with environments, fingers interact with objects,” noted co-author Deepak Pathak, of Carnegie Mellon University. “The basic idea can be applied to any robot.”

Even further, Malik suggested, the pairing of basic and adaptive algorithms could work for other intelligent systems. Smart homes and municipal systems tend to rely on preexisting policies, but what if they adapted on the fly instead?

For now the team is simply presenting its initial findings in a paper at the Robotics: Science and Systems conference, and acknowledges that there is a great deal of follow-up research to do. For instance, building an internal library of the improvised gaits as a sort of “medium term” memory, or using vision to predict the necessity of initiating a new style of locomotion. But RMA seems to be a promising new approach to an enduring challenge in robotics.

Holy Grail raises $2.7M seed fund to create modular carbon capture devices

The founders of Holy Grail, a two-year-old startup based in Cupertino, California, are taking a micro approach to solving the outsized problem of capturing carbon.

The startup is prototyping a direct air carbon capture device that is modular and small — a departure from the dozens of projects in the U.S. and abroad that aim to capture CO2 from large, centralized emitters, like power plants or industrial facilities. Holy Grail co-founder Nuno Pereira told TechCrunch that this approach will reduce costs and eliminate the need for permits or project financing.

While Holy Grail has a long development and testing phase ahead, the idea has captured the attention and capital of well-known investors and Silicon Valley founders. Holy Grail recently raised $2.7 million in seed funding from LowerCarbon Capital, Goat Capital, Stripe founder Patrick Collison, Charlie Songhurst, Cruise co-founder Kyle Vogt, Songkick co-founder Ian Hogarth, Starlight Ventures and 35 Ventures. Existing investors Deep Science Ventures, Y Combinator and Oliver Cameron, who co-founded Voyage, the autonomous vehicle company acquired by Cruise, also participated.

The carbon capture device is still in the prototype stage, Pereira said, with many specifics – such as the anticipated size of the end product and how long it will likely function – still to be worked out. Cost-effectively separating CO2 from the air is an extremely difficult problem to solve. The company is in the process of filing patents for the technology, so he declined to be too specific about many characteristics of the device, including what it will be made out of. But he did stress that the company is taking a fundamentally different technical approach to carbon capture.

“The current technologies, they are very complex. They are basically either [using] temperature or pressure [to capture carbon],” he said. “There is a lot of things that go into it, compressors, calciners and all these things.” Pereira said the company will instead use electricity to control a chemical reaction that binds to the CO2. He added that Holy Grail’s devices are not dependent on scale to achieve cost reductions, either. And they will be modular, so they can be stacked or configured depending on a customer’s requirements.

The scrubbers, as Pereira calls them, will focus on raw capture of CO2 rather than conversion (converting the CO2 into fuels, for example). Pereira instead explained – with a heavy caveat that much about the end product still needs to be figured out – that once a Holy Grail unit is full, it could be collected by the company, though where the carbon will end up is still an open question.

The company will start by selling carbon credits, using its devices as the carbon reducing project. The end goal is selling the scrubbers to commercial customers and eventually even individual consumers. That’s right: Holy Grail wants you to have your own carbon capture device, possibly even right in your backyard. But the company still likely has a long road ahead of it.

“We’re essentially shifting the scaling factor from building a very large mega-ton plant and having the project management and all that stuff to building scrubbers in an assembly line, like a consumer product to be manufactured.”

Pereira said many approaches will be needed to tackle the mammoth problem of reducing the amount of CO2 in the atmosphere. “The problem is just too big,” he said.

Why a mighty Antarctic glacier is purging ice into the sea

Climate 101 is a Mashable series that answers thought-provoking and salient questions about Earth’s warming climate.


It’s speeding up.

Antarctica’s Pine Island Glacier, already the biggest source of sea level rise from the ice-clad continent, has started purging more ice into the ocean than ever observed.

In research recently published in the journal Science Advances, glacier experts found Pine Island — which holds some 180 trillion tons of ice — lost big chunks of ice into the sea over the past few years (2017-2020), and the glacier picked up its pace. This means Pine Island continues to recede, weaken, and expel more ice into the ocean, with the potential to add significantly to the planet’s already problematic sea level rise. (Earth’s sea levels have already risen some eight to nine inches since 1880, and the rise is now accelerating.) Read more…

More about Climate Change, Climate 101, Science, and Climate Environment

Hubble snaps a radiant galaxy lit up by a very active black hole

The Hubble Space Telescope captured a brilliant image of spiral galaxy NGC 3254, which has a particularly luminescent and active core emitting as much energy as the rest of the galaxy combined.

NGC 3254 is classified as a Seyfert galaxy. Seyfert galaxies are defined by their extremely active centers. Why are they so active? They each have a supermassive black hole in them.

The intense gravitational pull of a supermassive black hole whips gas, dust, stars, planetary bodies, and other material into a frenzy of high-speed collisions as it all gets yanked inward. As all that space stuff rips apart, smashes together, and spaghettifies, temperatures soar. Read more…

More about Hubble Space Telescope, Science, and Space

Space tourism sounds fun. But it could be terrible for the planet.

Space travel has environmental costs. For research, it might be worth it. To send Jeff Bezos, Richard Branson, and other wealthy tourists into orbit? That’s debatable.

Companies including SpaceX, Virgin Galactic, and Space Adventures want to make space tourism more common. And people are interested. 

Japanese billionaire Yusaku Maezawa paid SpaceX an undisclosed sum in 2018 for a private trip around the moon and back. The trip is penciled in for 2023, although the Starship rocket still needs to prove it can reliably take off and land without exploding.

This month, someone paid $28 million to fly on Blue Origin’s New Shepard with the company’s owner, Amazon billionaire Jeff Bezos, and his brother, Mark Bezos. That trip is scheduled for July.  Read more…

More about Nasa, Jeff Bezos, Spacex, Science, and Space

Wooden satellite could be orbiting the Earth by 2021

A satellite made of wood sounds like a concept ripped from the pages of Leonardo da Vinci’s sketchbooks. But Finnish company Arctic Astronautics wants to launch one into space by the end of 2021. 

The WISA Woodsat, a microsatellite constructed out of plywood, was hoisted by weather balloon on Saturday to almost 20 miles above Earth, into the stratosphere, to test the plywood construction in space-like conditions.

It was armed with a selfie stick to document its journey and a sensor package supplied by the European Space Agency.

A physically unimposing 4-inch cube, Woodsat is built from “space plywood” made by UPM Plywood, which is dried out in a vacuum chamber and coated in a thin layer of aluminum oxide to keep it from degrading amid the highly reactive oxygen above the Earth’s atmosphere. Read more…

More about Sustainability, Satellites, Science, Climate Environment, and Space

The fat bear cams are back, baby

They’re back.

The wildlife streamers at explore.org officially turned on the Alaskan brown bear cams on Monday, June 14. The cameras, situated along the salmon-rich Brooks River in Katmai National Park and Preserve, film the internet-famous bears fishing, fighting, sleeping, playing, and beyond throughout the summer and fall.

Bear activity usually ramps up in July, when salmon begin migrating up the river. Here’s what to expect when tuning into the bear cams, which are beamed from a remote, mostly roadless part of Alaska, to people globally:

  1. July: The salmon run up the Brooks River kicks off in early July, and the bears start to congregate at the river to devour fat, 4,500-calorie sockeye salmon. It’s an exciting, phenomenal scene.

  2. August: Often the Brooks River and bear cams quiet down in August, as the bears leave to capitalize on other fishing opportunities (the Brooks River salmon run can dwindle by late July). Though during the big salmon run years of late, many bears still stick around, even in August.

  3. September: The bears, now often filled-out and rotund, return to the Brooks River (and bear cams) in great numbers to feast on dead and dying salmon. The winter looms large. Read more…

More about Fat Bear Week, Bear Cam, Science, and Animals

Why cicadas love to land on you, irresistible human

Even on an Air Force base tarmac, a cicada landed on the president’s neck

The spectacular emergence of the Brood X cicadas — with some 1.5 million of the harmless insects emerging per acre in some places — has resulted in cicadas landing on people’s shirts, arms, hair, and…beyond. But the bugs have no real interest in people: After 17 years of munching on roots underground, the brood emerges to hastily mate and lay eggs. 

So what’s going on? When cicadas emerge, they seek out trees, the places where they often congregate, mate, and ultimately lay eggs on the ends of branches. And to a cicada, trees and people have similarities.  Read more…

More about Animals and Science

Hubble catches sight of a beautifully swirling galaxy in flux

Hubble’s latest look into deep space is a real dazzler.

Image: ESA/Hubble & NASA, A. Riess et al.

Say hello to NGC 4680. This distant galaxy — it’s more than 350 million light-years away from Earth — is captured in beautiful detail and color by the Hubble Space Telescope’s Wide Field Camera 3. NASA shared the freshly created image on Friday, highlighting the “neighboring” galaxies visible in the image (one on the right side of the frame and one at the bottom).

Let’s not let those other galaxies steal the show, though. NGC 4680 is an interesting one because of how difficult it is to classify. It looks a bit like a spiral galaxy, and it may have more confidently been one long ago, but as the description points out, it’s often referred to as a lenticular galaxy. Read more…

More about Hubble Space Telescope, Science, and Space

If a scary asteroid will actually strike Earth, here’s how you’ll know

On April 13, 2029 (which happens to be Friday the 13th), something unsettling will happen.

A decent-sized asteroid, the 1,100-foot-wide Apophis, will pass so close to Earth it’ll be visible in the sky from certain places. Crucially, the giant rock will not strike our humble planet. But it will pass closer than 20,000 miles from the surface, which is closer than where some of the United States’ most prized weather satellites orbit.

Asteroids like Apophis hold a fascinating place in our existence: Big impacts are at once terrible threats to our lives, and potentially to the planet’s habitability for many species, but they’re also extremely rare and irregular events. Yet the internet — awash with clickbait — likes to incessantly warn of incoming threats with misleading headlines like “Asteroid heading our way day before presidential election,” “Should you be worried about the ‘potentially hazardous’ asteroid passing by Earth today?,” and “Massive asteroid will swing by Earth after Valentine’s Day.” Read more…

More about Space, Nasa, Science, and Space

Feast your eyes on Ganymede, the largest moon in the solar system

Ganymede, a behemoth of an icy moon orbiting Jupiter, was captured in vivid detail by NASA’s Juno spacecraft on June 7. It was the closest flyby of the moon in more than two decades.

Photos shared by NASA Tuesday highlight stunning details of the surface of Ganymede. There are dozens of visible craters, contrasting light and dark terrains, and other large surface features that NASA says could indicate the presence of tectonic faults — the same kind of planetary faults that stretch across the surface of Earth.

The full view of Ganymede taken by Juno, revealing most of one side of the moon.

Image: NASA/JPL-CALTECH/SWRI/MSSS
Read more…

More about Nasa, Solar System, Science, and Space

A SpaceX rocket launched, released a satellite, and landed perfectly overnight

While much of the United States was either sleeping or winding down their Saturday evening, SpaceX was keeping busy.

The aerospace company sent one of its Falcon 9 rockets skyward at around 12:30 a.m. ET in an overnight launch that ferried a hefty Sirius XM satellite, dubbed SXM-8, into Earth’s orbit. It was a smooth 125th mission for the company, also its 18th of 2021.

Live webcast of SXM-8 mission → https://t.co/bJFjLCzWdK https://t.co/MesakMwAaY

— SpaceX (@SpaceX) June 6, 2021

It was the third successful launch and landing for the Falcon 9 rocket that did most of the (literal) heavy lifting. Unlike NASA rockets of the past, SpaceX’s Falcon 9 booster is built to be reusable. After the booster ferried its payload out of Earth’s atmosphere, built-in thrusters flipped it around and reoriented it for the return trip. Read more…

More about Spacex, Science, and Space

Utah drought is so bad, the governor appeals for ‘divine intervention’

The megadrought is real

Amid the worst Southwestern drought in at least 400 years — and easily one of the worst sustained droughts in over a millennium — Utah Gov. Spencer Cox has asked the denizens of Utah to pray for rain:

“By praying together and collectively asking God or whatever higher power you believe in for more rain, we may be able to escape the deadliest aspects of the continuing drought,” Gov. Cox said on Thursday. 

The 20-year drying trend in Utah and the greater Southwest, exacerbated by record Utah dryness in 2020, is prolonged because a relentlessly warming climate is drying out the land. It’s a “hot drought”: Precipitation trends overall in the Southwest haven’t changed much over the last half-century, but with added heat, more water now evaporates from rivers, plants, soils, and snowpack. This makes it easier to fall into drought spells, and harder to get out, climate scientists say. Read more…

More about Climate Change, Science, and Climate Environment

Storm experts will send tough robots directly into hurricanes

A seafaring drone can sail where people can’t: straight into a hurricane.

During the 2021 Atlantic hurricane season (it’s predicted to be busy), that’s exactly what scientists will do: send marine robots into the heart of churning cyclones. The unprecedented mission aims to improve researchers’ understanding of how hurricanes rapidly intensify into monstrous storms with destructive winds and deadly flooding. If all goes as planned, the drones will venture through the storm’s most violent winds, which circle the eye of a cyclone, called an eyewall.

“We want to go straight through — we want to go through the eyewall,” said Gregory Foltz, a National Oceanic and Atmospheric Administration (NOAA) oceanographer who’s working on the mission.  Read more…

More about Hurricanes, Science, and Climate Environment

Earthlings, rejoice: We’re visiting Venus again. Twice.

Memo to Elon Musk: You were paying attention to the wrong planet all along. Venus, not Mars, is where it’s at. 

That much was confirmed Wednesday, when NASA announced the two winners out of dozens of entries in its $300 million planetary probe contest. VERITAS and DAVINCI+ were chosen as the agency’s next two Discovery missions — and both have our mysterious sister planet in their sights. 

The growing community of Venus exploration boosters inside and outside NASA, which I wrote about last year, had hoped at least one of these missions would be anointed — but dared not dream it would be both. Some planetary scientists have been waiting for this day for decades. Leading luminaries promptly lost their shit.  Read more…

More about Venus, Science, and Space

OroraTech’s space-based early wildfire warnings spark $7M investment

With wildfires becoming an ever more devastating annual phenomenon, it is in the whole planet’s interest to spot them and respond as early as possible — and the best vantage point for that is space. OroraTech is a German startup building a constellation of small satellites to power a global wildfire warning system, and will be using a freshly raised €5.8M (~$7M) A round to kick things off.

Wildfires destroy tens of millions of acres of forest every year, causing immense harm to people and the planet in countless ways. Once they’ve grown to a certain size, they’re near impossible to stop, so the earlier they can be located and worked against, the better.

But these fires can start just about anywhere in a dried-out forest hundreds of miles wide, and literally every minute and hour counts — watch towers, helicopter flights, and other frequently used methods may not be fast or exact enough to effectively counteract this increasingly serious threat. Not to mention these are expensive and often dangerous jobs for those who perform them.

OroraTech’s plan is to use a constellation of about 100 satellites equipped with custom infrared cameras to watch the entire globe (or at least the parts most likely to burst into flame) at once, reporting any fire bigger than ten meters across within half an hour.

Screenshot of OroraTech wildfire monitoring software showing heat detection in a forest.

Image Credits: OroraTech

To start out with, the Bavarian company has used data from over a dozen satellites already in space, in order to prove out the service on the ground. But with this funding round they are set to put their own bird in the air, a shoebox-sized satellite with a custom infrared sensor that will be launched by Spire later this year. Onboard machine learning processing of this imagery simplifies the downstream process.
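As a toy sketch of what onboard hotspot detection might involve (illustrative only: OroraTech's actual pipeline uses machine learning, and the threshold, pixel values, and function names here are invented), a thermal-infrared frame can be scanned for pixels whose brightness temperature exceeds a cutoff:

```python
# Toy hotspot detector: flag pixels in a thermal frame whose brightness
# temperature (in kelvin) exceeds a fixed cutoff. The cutoff and the
# frame values are invented for illustration, not calibrated numbers.
FIRE_THRESHOLD_K = 330.0

def detect_hotspots(frame):
    """Return (row, col) coordinates of pixels above the threshold."""
    return [(r, c)
            for r, row in enumerate(frame)
            for c, temp in enumerate(row)
            if temp > FIRE_THRESHOLD_K]

frame = [
    [295.0, 296.1, 295.4],
    [297.2, 341.8, 296.0],   # one hot pixel: possible fire
    [295.9, 296.3, 295.1],
]
print(detect_hotspots(frame))  # [(1, 1)]
```

A production system also has to reject false positives (sun glint, hot rock, industrial sites), which is where the learned models earn their keep.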

Fourteen more satellites are planned for launch by 2023, presumably once the company has kicked the proverbial tires on the first one and come up with the inevitable improvements.

“In order to cover even more regions in the future and to be able to give warning earlier, we aim to launch our own specialized satellite constellation into orbit,” said CEO and co-founder Thomas Grübler in a press release. “We are therefore delighted to have renowned investors on board to support us with capital and technological know-how in implementing our plans.”

Mockup of an OroraTech Earth imaging satellite in space.

Those renowned investors consist of Findus Venture and Ananda Impact Ventures, which led the round, followed by APEX Ventures, BayernKapital, Clemens Kaiser, SpaceTec Capital and Ingo Baumann. The company was spun out of research done by the founders at TUM, which maintains an interest.

“It is absolutely remarkable what they have built up and achieved so far despite limited financial resources and we feel very proud that we are allowed to be part of this inspiring and ambitious NewSpace project,” APEX’s Wolfgang Neubert said, and indeed it’s impressive to have a leading space-based data service with little cash (it raised an undisclosed seed about a year ago) and no satellites.

It’s not the only company doing infrared imagery of the Earth’s surface; SatelliteVu recently raised money to launch its own, much smaller constellation, though it’s focused on monitoring cities and other high-interest areas, not the vast expanse of forests. And ConstellR is aimed (literally) at the farming world, monitoring fields for precision crop management.

With money in its pocket Orora can expand and start providing its improved detection services, though sadly, it likely won’t be upgrading before wildfire season hits the northern hemisphere this year.

How to see a photo NASA’s Hubble telescope took on your birthday

The Hubble Space Telescope is one of the most illuminating and awe-inspiring inventions, capturing the beauty of the universe with its massive mirror as it orbits the Earth day in and day out.

The telescope, run jointly by NASA and ESA, has been almost constantly in operation since it launched in April 1990, taking in galaxies and nebulae and all sorts of far-away space objects. No matter where your birthday lands on the calendar (except if it’s on Leap Day, sorry), Hubble has an impressive image that it captured on that date at some point over the last 31 years.

In 2020, NASA published a page on its site where you can input a month and a date and it serves you a spacey image captured on that day.  Read more…

More about Nasa, Hubble Space Telescope, Science, and Space

Hubble captures a luminous spiral galaxy looking all chill and relaxed

The Hubble Space Telescope captured a look at NGC 5037, a spiral galaxy swirling around in the constellation Virgo, NASA shared on Friday.

Located roughly 150 million light-years away from Earth, the galaxy was first documented by astronomer and musical composer William Herschel in 1785. This new, incredibly detailed image of the galaxy is a composite of images taken with Hubble’s Wide Field Camera 3.

The full image shows NGC 5037 encased in a wide, bold border of dark space.

The full image of NGC 5037 reveals a broader look at the galaxy's surroundings, a largely black abyss with a smattering of distant lights.

Image: ESA/HUBBLE & NASA, D. ROSARIO, ACKNOWLEDGMENT: L. SHATZ
Read more…

More about Hubble Space Telescope, Science, and Space

NASA Mars rover sends back photos of shimmering, otherworldly clouds

Mars doesn’t have too many cloudy days, so this new set of images from a NASA-operated rover is a full-on treat.

The Curiosity rover has been gathering data on the Red Planet since it touched down in August 2012. And now, while many space watchers’ eyes are turned toward the Perseverance rover and Ingenuity helicopter that both arrived in February, Curiosity is here to remind us that it’s still putting in plenty of work, too.

So. Back to clouds. They’re not as common on Mars as they are on Earth because the Martian atmosphere is thin and dry, and the clouds we see here on Earth are basically just suspended water droplets and ice crystals. They do happen on Mars, but it’s usually near the planet’s equator and only during the winter, when Mars’ orbit takes it as far from the sun as it ever gets. Read more…

More about Nasa, Mars, Curiosity Rover, Science, and Space

NASA’s Mars Ingenuity helicopter spirals erratically on sixth flight (but still takes a stunning photo)

We’re happy you’re safe, Ingenuity.

NASA’s Ingenuity Mars helicopter pitched and rolled erratically after venturing out on its sixth flight above the Red Planet. The unexpectedly turbulent flight happened on May 22, but NASA only released information about the “in-flight anomaly” on May 27.

The helicopter was on a mission to “demonstrate aerial-imaging capabilities,” according to an article published on NASA’s website by Håvard Grip, Ingenuity Mars Helicopter Chief Pilot at NASA’s Jet Propulsion Laboratory.

Ingenuity managed to fly 33 feet above the Martian landscape and snap an image of the planet’s surface (pictured above) before things started spiraling.  Read more…

More about Mars, Science, and Space

Anthropic is the new AI research outfit from OpenAI’s Dario Amodei, and it has $124M to burn

As AI has grown from a menagerie of research projects to include a handful of titanic, industry-powering models like GPT-3, there is a need for the sector to evolve — or so thinks Dario Amodei, former VP of research at OpenAI, who struck out on his own to create a new company a few months ago. Anthropic, as it’s called, was founded with his sister Daniela and its goal is to create “large-scale AI systems that are steerable, interpretable, and robust.”

The challenge the siblings Amodei are tackling is simply that these AI models, while incredibly powerful, are not well understood. GPT-3, which they worked on, is an astonishingly versatile language system that can produce extremely convincing text in practically any style, and on any topic.

But say you had it generate rhyming couplets with Shakespeare and Pope as examples. How does it do it? What is it “thinking”? What knob would you tweak, what dial would you turn, to make it more melancholy, less romantic, or limit its diction and lexicon in specific ways? Certainly there are parameters to change here and there, but really no one knows exactly how this extremely convincing language sausage is being made.

It’s one thing to not know when an AI model is generating poetry, quite another when the model is watching a department store for suspicious behavior, or fetching legal precedents for a judge about to pass down a sentence. Today the general rule is: the more powerful the system, the harder it is to explain its actions. That’s not exactly a good trend.

“Large, general systems of today can have significant benefits, but can also be unpredictable, unreliable, and opaque: our goal is to make progress on these issues,” reads the company’s self-description. “For now, we’re primarily focused on research towards these goals; down the road, we foresee many opportunities for our work to create value commercially and for public benefit.”

Excited to announce what we’ve been working on this year – @AnthropicAI, an AI safety and research company. If you’d like to help us combine safety research with scaling ML models while thinking about societal impacts, check out our careers page https://t.co/TVHA0t7VLc

— Daniela Amodei (@DanielaAmodei) May 28, 2021

The goal seems to be to integrate safety principles into the existing priority system of AI development, which generally favors efficiency and power. As in any other industry, it’s easier and more effective to incorporate something from the beginning than to bolt it on at the end. Attempting to make some of the biggest models out there able to be picked apart and understood may be more work than building them in the first place. Anthropic seems to be starting fresh.

“Anthropic’s goal is to make the fundamental research advances that will let us build more capable, general, and reliable AI systems, then deploy these systems in a way that benefits people,” said Dario Amodei, CEO of the new venture, in a short post announcing the company and its $124 million in funding.

That funding, by the way, is as star-studded as you might expect. It was led by Skype co-founder Jaan Tallinn, and included James McClave, Dustin Moskovitz, Eric Schmidt, and the Center for Emerging Risk Research, among others.

The company is a public benefit corporation, and the plan for now, as the limited information on the site suggests, is to remain heads-down on researching these fundamental questions of how to make large models more tractable and interpretable. We can expect more information later this year, perhaps as the mission and team coalesce and initial results pan out.

The name, incidentally, seems to derive from the “Anthropic principle,” the notion that intelligent life is possible in the universe because… well, we’re here. Perhaps the idea is that intelligence is inevitable under the right conditions, and the company wants to create those conditions.

Zero-G space fridge could keep astronaut food fresh for years

Regular supply launches keep astronauts aboard the ISS supplied with relatively fresh food, but a flight to Mars won’t get deliveries. If we’re going to visit other planets, we’ll need a fridge that doesn’t break down in space — and Purdue University researchers are hard at work testing one.

You may think there’s nothing to prevent a regular refrigerator from working in space. It sucks heat out and puts cold air in. Simple, right? But refrigerators rely on gravity to distribute oil through the compressor system that regulates temperature, so in space these systems don’t work or break down quickly.

The solution being pursued by the Purdue team and its manufacturing partner, Air Squared, is an oil-free version of the traditional fridge that will work regardless of gravity’s direction or magnitude. It was funded by NASA’s SBIR program, which awards money to promising small businesses and experiments in order to inch them toward mission readiness. (The program is currently in its Phase II extended-period award.)

After two years of development, the team at last assembled a flight-ready prototype, and last month it was finally able to test the fridge in microgravity simulated on a parabolic plane flight.

Initial results are promising: the fridge worked.

“The fact that the refrigeration cycles operated continuously in microgravity during the tests without any apparent problems indicates that our design is a very good start,” said Leon Brendel, a Ph.D student on the team. “Our first impression is that microgravity does not alter the cycle in ways that we were not aware of.”

Short-term microgravity (the prototype was weightless for only 20 seconds at a time) is just a limited test, of course, but it already helped shake out an issue with the device that the team is working on. The next test might be a longer-term installation aboard the ISS, the denizens of which would no doubt like to have a working fridge.

While the prospect of cold drinks and frozen (but not freeze-dried) meals is tantalizing, a normal refrigerator could be used for all kinds of scientific work as well. Experiments that need cold environments currently either use complicated, small-scale cooling mechanisms or rely on the near-absolute-zero conditions of space. So it’s no surprise NASA got the team aboard the microgravity simulator as part of the Flight Opportunities program.

Analysis of the data collected on the flights is ongoing, but the success of this first big test validates both the approach and execution of the space fridge. Next up is figuring out how it might work in the limited space and continuous microgravity of the ISS.

Why COVID vaccines give way better protection than a COVID infection


COVID vaccines give us much better protection than a COVID infection, say infectious disease experts.

That’s one of many reasons to get a COVID shot; the vaccines are rigorously (and continually) tested for safety. They trigger a significantly more robust immune response than a naturally acquired infection, and that ultimately better prepares your body for a real infection, which can ravage the lungs, among other risks.

“I would advise everyone to get the vaccine,” said Philip Felgner, an infectious disease expert and director of the Vaccine Research and Development Center at the University of California, Irvine. Read more…

More about Covid 19, Science, and Health

What is a super flower blood moon and how can you see it?


On May 26, the only total lunar eclipse of 2021 will be visible from some parts of the world, weather permitting. This particular eclipse happens to have an awesome name — the super flower blood moon — and is a must-see for any moon lovers out there in the right area. 

The super flower blood moon gets its cool name from the color, size, and month of the eclipse. Since the moon will be at perigee, the closest point to the Earth in its oval-shaped orbit, during the eclipse, it will appear to be a “super,” or slightly larger, moon. “Flower” comes from the Farmer’s Almanac designation for a full moon that occurs in the month of May. As for “blood,” during totality the only sunlight reaching the moon has passed through Earth’s atmosphere, which filters out blue light and can give the moon a red tint. Put ’em all together and it’s a super flower blood moon. Read more…
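How much bigger does a "super" moon actually look? Angular size scales inversely with distance, so comparing rough average perigee and apogee distances gives a quick estimate (the distances below are approximate averages, not the figures for this particular eclipse):

```python
# Rough estimate of how much larger a perigee ("super") full moon looks
# compared with an apogee full moon. Distances are approximate orbital
# averages, labeled as assumptions for illustration only.
PERIGEE_KM = 357_000   # closest point of the moon's orbit (approximate)
APOGEE_KM = 406_000    # farthest point of the moon's orbit (approximate)

# For a fixed lunar diameter, apparent angular size is proportional to
# 1/distance, so the ratio of the two distances is the size ratio.
size_boost = APOGEE_KM / PERIGEE_KM - 1
print(f"A perigee full moon appears about {size_boost:.0%} larger than an apogee one")
```

That works out to roughly a 14% difference in apparent diameter, which is why supermoons look only slightly, not dramatically, bigger.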

More about Lunar Eclipse, Science, and Space

Millennials are entering a decade of despair. Here’s how they can prepare.


Turning 40 can be a fraught experience at the best of times. But spare a thought for millennials, the baby-boomer-sized generation that demographers generally define as beginning in 1981. Not only are their elders now reaching the big four-oh in the wake of a global pandemic, with a financial crisis to match the one that hit in their late 20s, but to add insult to injury, a viral Medium post this week described them as “geriatric millennials.”

That’s not the right term, of course. The older cohort already has a name, “Xennials”; the word “geriatric” usually refers to medical care of the elderly. Still, as the backlash shows, these quintessential 1980s kids are starting to feel their age. And it’s only going to get worse as the 2020s roll on, inducting millions more millennials into the fortysomething club — without the levels of home ownership and job security that previous generations enjoyed at 40. Read more…

More about Millennials, Science, Health, and Work Life

Virgin Galactic celebrates its third successful launch, the first ever in New Mexico


Human space flight — and space tourism — took a welcome step forward on Saturday.

That’s because Virgin Galactic made New Mexico just the third U.S. state ever to launch humans into space. The test flight of the company’s VSS Unity lifted off from Spaceport America in Truth or Consequences, NM a little after 10:30 a.m. ET on Saturday morning.

If you’ve gotten into the habit of tuning in to check out SpaceX launches involving the Falcon 9 launch rocket, it’s worth noting that this one is a little different. The VSS Unity is carried into the sky not by a massive rocket, but by a carrier plane.

This design, dubbed SpaceShipTwo, is actually more of a spaceplane. The carrier ship, in this case the VMS Eve, ferries its spacefaring cargo up to the release altitude (44,000 feet). At that point, the Unity detaches from its mothership and ignites its own onboard rocket, which takes it the rest of the way to the edge of space. Read more…

More about Virgin Galactic, Science, and Space

A furry little wolf pup’s first howls were caught on camera


Dang, what a set of pipes!

The Voyageurs Wolf Project, a University of Minnesota research project that works to observe and learn more about the wolves in Minnesota’s Voyageurs National Park, captured some amazing footage of a tiny wolf pup’s first howls.

The video was captured by a trail camera, and according to the description the pup seen howling was only four weeks old at the time. The tiny wolf is seen walking around in the leaves, practicing some hearty howls, and finally running off into the distance.

Bye, pup. A+ howls, little buddy.

The Voyageurs Wolf Project has also captured remarkable footage by attaching camera collars to wild wolves in the park. That’s how they got this informative video of a wolf catching several fish throughout the course of a day. Read more…

More about Science, Wolves, Culture, and Animals

The 25 best educational podcasts for learning what you missed in school


Most folks love learning, regardless of whether or not school is “their thing.” Sometimes it’s just a matter of finding the right teacher for your learning style—or maybe even the right medium.

For auditory learners, podcasts can be excellent vehicles for processing knowledge that’d be less digestible in more visual mediums like video or even the written word. The American education system tends to fail students in myriad ways, requiring continual education after the fact to learn the truth behind what we were taught in history, art, science, language, literature, and math. Privileged gatekeepers deciding who and what gets taught can result in the denial of diverse voices and perspectives. Read more…

More about Science, Art, Education, Literature, and Podcasts

See the colossal James Webb Space Telescope unfurl its giant, golden mirrors


Engineers commanded the James Webb Space Telescope to unfurl its over 21-foot wide golden mirrors this week, and the crucial test succeeded.

The test, shown in the image below, was the last check on the sprawling telescope’s moving parts under space-like conditions. James Webb, set to launch in Oct. 2021, will be the largest, most powerful space telescope ever built. It’s the next generation of space telescope, furthering the legendary observations made by the aging, over-30-year-old Hubble telescope.

After launching into space, Webb’s 18 hexagonal mirrors will align to form a “single, precise mirror,” NASA engineer Lee Feinberg said in a statement. “The primary mirror is a technological marvel,” Feinberg said. Read more…

More about Nasa, Science, and Space

Behold this giant moth


Mothra is real, and she is Australian. 

Construction workers at an Australian primary school (i.e. elementary school) found a huge bug this week, and have been proudly showing it off as per primary school tradition. To be fair, it is a very cool bug.

The Giant Wood Moth was discovered by builders working on Mount Cotton State School’s new classrooms, which are on the edge of a rainforest in south-east Queensland. Giant Wood Moths are common along the Queensland coast, but actually spotting one in the wild is a rarer phenomenon.

“The staff and students weren’t surprised by the find as we have a range of animals on our grounds at Mount Cotton State School such as bush turkeys, wallabies, koalas, ducks, the occasional snake that needs to be relocated back to our rainforest, echidnas, tree frogs, possums, chickens, and turtles,” said principal Meagan Steward, apparently unaware she is actually running a wildlife park with regular child visitors. “But a Giant Wood Moth was not something we had seen before.” Read more…

More about Australia, Moths, Science, and Animals

Constantly stressed at work? It might actually be changing your personality.


If you’re worried your bad workplace is making you a worse person, you may be right. Researchers from the University of Illinois recently introduced a new model examining how chronic workplace stress can fundamentally change people’s personalities — and predictably, it isn’t for the better. 

According to the researchers, previous studies on workplace behaviour largely operated on the premise that personalities are fixed. Hire someone kind, they’ll make the workplace kinder. Hire a jerk, they’ll bring jerk energy to the role. It’s all fairly logical. 

However, in a new paper published in the Journal of Management, organisational researchers Jarvis Smallfield and Donald H. Kluemper consider that workplace stress can actually alter people’s personalities in both the short and long term. This impact is examined through the Big Five model of personality traits: conscientiousness, agreeableness, neuroticism, openness, and extroversion. Read more…

More about Work, Stress, Science, and Health

Watch a monkey equipped with Elon Musk’s Neuralink device play Pong with its brain

Elon Musk’s Neuralink, one of his many companies and the only one currently focused on mind control (that we’re aware of), has released a new blog post and video detailing some of its recent updates — including using its hardware to make it possible for a monkey to play Pong with only its brain.

In the video above, Neuralink demonstrates how it used its sensor hardware and brain implant to record a baseline of activity from this macaque (named ‘Pager’) as it played a game on-screen where it had to move a token to different squares using a joystick with its hand. Using that baseline data, Neuralink was able to use machine learning to anticipate where Pager was going to move the physical controller, and was eventually able to predict it accurately before the move was actually made. Researchers then removed the paddle entirely, and eventually did the same thing with Pong, ultimately ending up at a place where Pager was no longer even moving its hand in the air on the nonexistent paddle, and was instead controlling the in-game action entirely with its mind via the Link hardware and embedded neural threads.
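Neuralink hasn’t published its decoder, but the general recipe described above — record neural activity alongside joystick movement, fit a model mapping activity to intended movement, then drive the game from predictions alone — can be sketched with a toy linear decoder. Everything here is simulated stand-in data, not Neuralink’s:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a calibration session: 500 time steps of joystick velocity (x, y)
# and 64 channels of firing rates that depend linearly on that velocity.
true_mapping = rng.normal(size=(2, 64))           # hidden velocity -> rates map
velocity = rng.normal(size=(500, 2))              # measured joystick movement
rates = velocity @ true_mapping + 0.1 * rng.normal(size=(500, 64))

# "Calibration": fit a linear decoder from firing rates back to velocity,
# analogous to training on the joystick baseline before removing the joystick.
decoder, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# "Decoding": predict intended movement from neural activity alone.
predicted = rates @ decoder
corr = np.corrcoef(predicted[:, 0], velocity[:, 0])[0, 1]
print(f"decoded vs. actual x-velocity correlation: {corr:.3f}")
```

Real neural decoders are far more sophisticated (nonlinear, adaptive, running in real time on the implant), but the calibrate-then-decode structure is the same.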

The last we saw of Neuralink, Musk himself was demonstrating the Link tech live in August 2020, using pigs to show how it was able to read signals from the brain depending on different stimuli. This new demo with Pager more clearly outlines the direction that the tech is headed in terms of human applications, since, as the company shared on its blog, the same technology could be used to help patients with paralysis manipulate a cursor on a computer, for instance. That could be applied to other paradigms as well, including touch controls on an iPhone, and even typing using a virtual keyboard, according to the company.

Musk separately tweeted that in fact, he expects the initial version of Neuralink’s product to be able to allow someone with paralysis that prevents standard modes of phone interaction to use one faster than people using their thumbs for input. He also added that future iterations of the product would be able to enable communication between Neuralinks in different parts of a patient’s body, transmitting between an in-brain node and neural pathways in legs, for instance, making it possible for “paraplegics to walk again.”

These are obviously bold claims, but the company cites a lot of existing research that undergirds its demonstrations and near-term goals. Musk’s more ambitious claims should, like all of his projections, definitely be taken with a healthy dose of skepticism. He did add that human trials will “hopefully” get underway later this year – which is already two years later than he was initially anticipating they might start.

FDA authorizes Moderna’s COVID-19 vaccine for emergency use

The U.S. Food and Drug Administration (FDA) has issued an Emergency Use Authorization (EUA) for Moderna’s COVID-19 vaccine, as expected after an independent panel commissioned by the administration recommended its approval earlier this week. This is the second vaccine now authorized for use in the U.S. under EUA, after the Pfizer-BioNTech vaccine was approved last week.

Moderna’s vaccine could begin being administered to Americans by “Monday or Tuesday” next week, according to Dr. Anthony Fauci speaking to NBC’s Today show in a new interview. That’s in keeping with the timelines between the Pfizer EUA and the first patients actually receiving the vaccine last week.

Like Pfizer’s vaccine, Moderna’s is an mRNA therapy. That means that it contains no actual virus — just genetic instructions that tell a person’s body to create a specific protein. That protein is more or less identical to the one that SARS-CoV-2, the virus which causes COVID-19, uses to attach to a host’s cells and replicate. Moderna’s vaccine causes a person to create just the protein, which on its own is harmless, and then their natural defenses via their immune system react to that and develop a method for fighting it off. That defense system is “remembered” by the body, while the vaccine itself naturally dissolves after a brief time, leaving a person with immunity but nothing else.

The Oxford-AstraZeneca vaccine, which has yet to be approved for use in the U.S., uses a weakened and modified common cold virus that doesn’t spread in humans to create the spike protein in recipients, resulting in the body generating its own immune response. That’s a much more tried-and-tested method for creating a vaccine, but both Moderna and Pfizer’s mRNA therapies have shown to be very effective in preliminary data from their large Phase 3 clinical trials.

FDA grants emergency use authorization for Pfizer’s COVID-19 vaccine, distribution to begin within days

The U.S. Food and Drug Administration (FDA) has granted an Emergency Use Authorization (EUA) for the COVID-19 vaccine developed by Pfizer and its partner BioNTech, the New York Times first reported on Friday night, and later supported by The Wall Street Journal. This EUA follows a recommendation by an independent panel of experts commissioned by the FDA to review Pfizer’s application and provide a recommendation, which the panel unanimously supported earlier this week.

Following this authorization, shipments of the vaccine are expected to begin immediately, with 2.9 million doses in the initial shipment order. Patients in the category of highly vulnerable individuals, which includes healthcare workers and senior citizens in long-term care facilities, are expected to begin receiving doses within just a few days now that the EUA is granted.

This approval isn’t a full certification by the U.S. therapeutics regulator, but it is an emergency measure that still requires a comprehensive review of the available information supplied by Pfizer based on its Phase 3 clinical trial, which covered a group of 44,000 volunteer participants. Pfizer found that its vaccine, which is an mRNA-based treatment, was 95% effective in its final analysis of the data resulting from the trial to date – and also found that safety data indicated no significant safety issues in patients who received the vaccine.
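The 95% figure comes from comparing case counts between the trial's two roughly equal-sized arms: efficacy is one minus the ratio of cases in the vaccinated group to cases in the placebo group. Using the publicly reported primary-analysis split of 170 total symptomatic cases:

```python
# Vaccine efficacy from Pfizer's reported primary-analysis case split:
# 170 symptomatic COVID-19 cases total, 8 in the vaccine arm, 162 in placebo.
# With roughly equal-sized trial arms, efficacy = 1 - (vaccine / placebo cases).
vaccine_cases = 8
placebo_cases = 162

efficacy = 1 - vaccine_cases / placebo_cases
print(f"Estimated efficacy: {efficacy:.1%}")  # about 95.1%
```

The full statistical analysis adjusts for person-time at risk in each arm, but with arms this balanced the simple ratio lands on the same headline number.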

On top of the initial 2.9 million-dose order, the U.S. intends to distribute around 25 million doses by the end of 2020, which could result in far fewer people actually vaccinated, since the Pfizer course requires two inoculations for maximum efficacy. Most Americans shouldn’t expect the vaccine to be available until at least late Q1 or Q2 2021, given the pace of Pfizer’s production and the U.S. order volume.

Still, this is a promising first step, and a monumental achievement in terms of vaccine development turnaround time, since it’s been roughly eight months since work began on the Pfizer vaccine candidate. Moderna has also submitted an EUA for its vaccine candidate, which is also an mRNA treatment (which provides instructions to a person’s cells to produce effective countermeasures to the virus). That could follow shortly, meaning two vaccines might be available under EUA within the U.S. before the end of the year.

Facebook launches climate change information center


Facebook has launched a new resource to help in the fight against climate change. 

On Tuesday, ahead of the annual Climate Week summit in New York City, the company announced the launch of its Climate Science Information Center, a special Facebook Page that offers information and resources on climate change. 

The company says the new climate info center will provide information on climate change from the world’s leading climate organizations, with Facebook News curators publishing posts from “quality publishers and other relevant sources.” The page will also provide viewers with steps they can take to prevent climate change.  Read more…

More about Facebook, Climate Change, Science, and Climate Environment

Astronauts successfully depart the ISS aboard SpaceX Dragon, starting their trip home

NASA astronauts Bob Behnken and Doug Hurley have successfully undocked from the International Space Station, which is the first crucial stage of their return to Earth. Next, they’ll travel on a coast phase that will take them on a descent course back through the atmosphere from space, shedding speed as they prepare to deploy the parachutes of the SpaceX Crew Dragon spacecraft and drop into the Atlantic Ocean for recovery.

The undocking, coast and splashdown phase are all meant to be performed entirely via automation, with the control systems SpaceX designed for Crew Dragon managing the entire process, including burns to control the capsule’s travel away from the Station and its controlled descent through the atmosphere. While re-entering the atmosphere, the Dragon will undergo tremendous stress, and its angle of descent is intended to slow its velocity to the point where it can safely deploy those parachutes to slow its fall even further, all the while keeping Behnken and Hurley safe.

The coast phase will take many hours, with SpaceX and NASA expecting the capsule’s eventual splashdown to happen sometime around 2:42 PM EDT (11:42 AM PDT) tomorrow, Sunday, August 2.

This is the final phase of SpaceX’s Demo-2 mission from its Commercial Crew program with NASA, which is the qualification program that the agency requires to certify Crew Dragon for regular operational missions taking astronauts to and from the station. Behnken and Hurley launched on the first part of this historic mission, which is the first to see humans fly aboard a SpaceX spacecraft, on May 30, and have spent the intervening months on the Space Station contributing to regular crew missions.

Crew Dragon will splash down off the coast of Florida to conclude Demo-2, and SpaceX crews are on hand to recover the astronauts at that point and bring them the rest of the way back to terra firma. If everything goes to plan, then SpaceX will officially be ready to begin standard astronaut flights, as mentioned – and the first of those is planned for sometime in late September, so they won’t have to wait long.

We’ll have updates for the remainder of this final leg as they become available, so stay tuned.

SpaceX’s astronaut launch marks the dawn of the commercial human spaceflight industry

SpaceX on Saturday launched two NASA astronauts aboard its Crew Dragon spacecraft, and the accomplishment is a tremendous one for both the company and the U.S. space agency. At a fundamental level, it means that the U.S. will have continued access to the International Space Station, without having to rely on continuing to buy tickets aboard a Russian Soyuz spacecraft to do so. But it also means the beginning of a new era for the commercial space industry – one in which private companies and individuals buying tickets for passenger trips to space is a consistent and active reality.

With this mission, SpaceX will complete the final step required by NASA to human-rate its Falcon 9 and Crew Dragon spacecraft, which means that it can begin operationally transporting people from Earth essentially as soon as this mission concludes (Crew Dragon still has to rendezvous with the space station tomorrow, and make its way back to Earth with astronauts on board in a few weeks). Already, SpaceX has signed an agreement with Space Adventures, a private space tourism booking company that has previously worked with Roscosmos on sending private astronauts to orbit.

SpaceX wants to start sending up paying tourists on orbital flights (without any ISS stops) starting as early as next year aboard Crew Dragon. The capsule actually supports up to seven passengers per flight, though only four seats will ever be used for official NASA crew delivery missions to the space station. SpaceX hasn’t released pricing on private trips aboard the spacecraft, but you can bet they’ll be expensive, since a Falcon 9 launch (without a human-rated capsule) costs around $60 million, and even dividing that by seven works out to a high price of entry.
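That back-of-the-envelope division looks like this; the $60M figure is the launch-cost estimate cited above, and the result is only a floor, since capsule hardware, operations, and margin would all push a real ticket price higher:

```python
# Rough per-seat cost floor for a Crew Dragon tourist flight, using the
# article's ~$60M Falcon 9 launch cost and the capsule's seven-seat capacity.
# This ignores the capsule itself, operations, and profit margin.
FALCON_9_LAUNCH_COST = 60_000_000  # USD, estimate cited in the article
SEATS = 7

per_seat_floor = FALCON_9_LAUNCH_COST / SEATS
print(f"Launch cost alone works out to about ${per_seat_floor:,.0f} per seat")
```

Even this lower bound lands well north of $8 million a seat, which is why early customers are expected to be the very deep-pocketed.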

So this isn’t the beginning of the era of accessible private spaceflight, but SpaceX is the first private company to actually put people into space, despite a lot of talk and preparatory work by competitors like Virgin Galactic and Blue Origin. And just like in the private launch business, crossing the gulf between having a private company that talks about doing something, and a company that actually does it, will absolutely transform the space industry all over again.

Here’s how.

Tourism

SpaceX is gearing up to launch tourists as early as next year, as mentioned, and while those tourists will have to be deep-pocketed, as with everything that SpaceX does, the goal is to continue to find ways to make more aspects of the launch system reusable and reduce the cost of launch in order to bring prices down.

Even without driving down costs, SpaceX will have a market, however niche, and one that hasn’t yet really had any inventory to satisfy demand. Space Adventures has flown a few individuals by buying tickets on Soyuz launches, but that hasn’t really been a consistent or sustainable source of commercial human spaceflight, and SpaceX’s system will likely have active support and participation from NASA.

That’s an entirely new revenue stream for SpaceX to add to its commercial cargo launches, along with its eventual launch of commercial internet service via Starlink. It’s hard to say yet what kind of impact that will actually have on its bottom line, but it could be big enough to matter – especially if the company can figure out creative ways to defray costs over successive years, since each price cut will likely considerably expand its small addressable audience.

SpaceX’s impact on the launch business was to effectively create a market for small satellites and more affordable orbital payloads that simply didn’t make any economic sense with larger existing launch craft, most of which were bankrolled almost entirely by and for defence and NASA use. Similarly, it’s hard to predict what the space tourism market will look like in five years, now that a company is actually offering it and flying a human-rated private spacecraft that can make it happen.

Research

Private spacefarers won’t all be tourists – in fact, it could make a lot more financial sense for the majority of passengers to and from orbit to be private scientists and researchers. Basically, imagine a NASA astronaut, but working for a private company rather than a publicly-funded agency.

Astronauts are essentially multidisciplinary scientists, and the bulk of their job is conducting experiments on the ISS. NASA is very eager to expand commercial use of the ISS, and also to eventually replace the aging space station with a private one of which it’s just one of multiple customers. Already, the ISS hosts commercial experiments and cargo, but if companies and institutions can now also send their own researchers, that may change considerably how much interest there is in doing work on orbit, especially in areas like biotech, where the advantages of low gravity can produce results not possible on Earth.

Cost is again a significant limiting factor here, since the price per seat will be – no pun intended – astronomical. But for big pharma and other large companies that already spend a considerable amount on R&D, it might actually be within reach. Especially in industries like additive manufacturing, where orbit is an area of immense interest, private space-based labs with actual rotating staff might not be that farfetched an idea.

Marketing & Entertainment

Commercial human spaceflight might actually be a great opportunity to make actual commercials – brands trying to outdo each other by shooting the first promo in space definitely seems like a likely outcome for a Super Bowl spot. It’s probably not anyone’s priority just now, given the ongoing global pandemic, but companies have already discussed the potential of marketing partnerships as a key driver of real revenue, including lunar lander startup ispace, which has signed a number of brand partners to fund the build and flight of its hardware.

Single-person rides to orbit are definitely within budget for the most extreme marketing efforts out there, and especially early on, there should be plenty of return on that investment just because of how audacious and unique the move is. The novelty will likely wear off, but access to space will remain rarefied enough for the foreseeable future that it could still be part of more than a few marketing campaigns.

As for entertainment, we’ve already seen the first evidence of interest there – Tom Cruise is working on a project to be filmed at least in part in space, apparently on board the International Space Station. SpaceX is said to be involved in those talks, and it would make a lot of sense for the company to consider a Crew Dragon flight with film crew and actors on board for both shooting, and for transportation to ‘on location’ shoots on the ISS.

Cruise probably isn’t the only one to consider the impact of a space-based motion picture project, and you can bet at least one reality show producer somewhere is already pitching ‘The Bachelor’ in space. Again, it’s not going to be within budget for every new sci-fi project that spins up, but it’s within blockbuster budget range, and that’s another market that simply didn’t exist as a possibility before today.

Novel industry

It’s hard to fully appreciate what kind of impact this will have, because SpaceX has literally taken something that previously wasn’t possible, and made it available – at costs that, while high, aren’t so high as to be absurd. As with every other such expansion, it will likely create new and innovative opportunities that haven’t even been conceived, especially once the economics and availability of flights, etc. are clarified. GPS, another great space-based innovation, formed the bedrock of an industry that changed just about every aspect of human life – private commercial spaceflight could do the same.

5 Compelling Reasons To Use Compression Garments For Running

Weight loss, better sleep, strong bones – these are just some of the health benefits of running. But when it comes to such exercise, the clothes you wear can also impact its quality. Workout clothes are not all made the same way, so they can make or break your performance. For this reason, investing in a pair of high-quality compression tights is an excellent upgrade to your present running gear.

Compression tights are usually made of nylon and spandex and are designed to snugly hug your body and legs. These unique features offer several benefits for running. Here are some benefits that should convince you to add compression pants to your running arsenal.

Facilitate Muscle Recovery


As in any sport or physical activity, it is important for you to recover quickly from your workout. Sore and aching muscles can interfere with your daily life. If your recovery takes too long, you may not be able to perform at optimal levels or make progress on your next run.

Aside from taking time to do warm-up stretches before starting your exercise, you can improve your muscle recovery by wearing compression leggings.

A study conducted by William Kraemer, professor of kinesiology in the Neag School of Education, compared one group of women who wore compression garments for five days with another group that did not wear any compression garments. The group wearing compression clothing was found to have better recovery of muscle strength and power. Furthermore, there was significantly less perceived muscle soreness in the first group compared to the group not wearing compression garments.

Shield from the Elements

Serious runners don’t let something like the weather stop them from going for a run. For this reason, it is important to invest in good-quality running gear that will give you protection from all kinds of things, from inclement weather to insects.

Compression tights have an added feature of giving your legs an extra layer of protection. Scorching heat, cold winds, and thorny bushes can make your run uncomfortable and even dangerous. Wearing compression leggings gives you full coverage from these elements.

If it is particularly cold, make sure to wear compression tights that are designed for cold climates and with long underwear underneath them. Include other winter running gear, like a winter beanie and gloves to protect your head and hands from the cold.

Improve Circulation


Have you ever run while wearing ill-fitting workout pants or leggings in a fabric that is uncomfortably tight and thick? If you have, then you know how much it can interfere with your movement and general performance. You’ll suffer soreness and even chafing during and after the exercise.

Compression tights are designed to glide on your legs and body like a second skin. This design increases circulation in your legs, making running easier and more comfortable. In addition to providing comfort, compression tights can improve your stamina and speed because they decrease energy expenditure. You focus more on your performance and progress and avoid getting distracted by pain caused by poor circulation.

Do note that some compression tights are made specifically for medical purposes. Take for example those that provide support to pregnant women or to people whose jobs involve standing for hours. These garments may help prevent disorders caused by poor circulation, like varicose veins.

Versatile Design

Different runners have different needs. For these various runs, you’ll need specific running gear to improve your performance. Whether you are into long marathons, short sprints, or recovery runs, there are compression pants that will meet your requirements and help you be at your best.

Some compression leggings are designed specifically for muscle support. This is important for long runs and for runners who are prone to aching and sore muscles. Compression pants with muscle support are also a smart choice if you are doing a recovery run.

There are also compression tights that feature more knee support if you have problematic knees. If you live in an area with particularly cold winters, go for thermal-compression pants, which are designed to keep you warm.

On the other hand, if you live in a humid and rainy location, you can pick compression pants that are lightweight and moisture-wicking, so they dry fast and don’t make you feel dehydrated and too hot while on the move.

Boost Movement

Speed is one of the things you want to improve when you run, and wearing compression tights can help you keep breaking your old records. You may have ideal weather in mind when you run. Maybe you love running when it is only slightly sunny with no strong winds or rain, but you know you can’t control the weather. You still need to hit the pavement even when it’s cold and windy or even when it’s raining.

Compression tights can help you run faster even in these conditions because they are designed to glide smoothly and snugly on your legs. As a result, there’s less drag compared to wearing pants in a thicker and heavier fabric. Compression pants are also contoured so they reduce wind resistance and will not hamper your stride and movements.

The post 5 Compelling Reasons To Use Compression Garments For Running appeared first on Dumb Little Man.

The ‘PuffPacket’ could help researchers learn when, how and why people vape

Vaping is a controversial habit: it certainly has its downsides, but anecdotally it’s a fantastic smoking cessation aid. The thing is, until behavioral scientists know a bit more about who does it, when, how much and other details, its use will continue to be something of a mystery. That’s where the PuffPacket comes in.

Designed by Cornell engineers, the PuffPacket is a small device that attaches to e-cigarettes (or vape pens, or whatever you call yours) and precisely measures their use, sharing that information with a smartphone app for the user, and potentially researchers, to review later.

Some vaping devices are already set up with something like this, to tell a user when the cartridge is running low or a certain limit has been reached. But generally, when vaping habits are studied, researchers rely on self-reported data, not proprietary apps.

“The lack of continuous and objective understanding of vaping behaviors led us to develop PuffPacket to enable proper measurement, monitoring, tracking and recording of e-cigarette use, as opposed to inferring it from location and activity data, or self-reports,” said PhD student Alexander Adams, who led the creation of the device, in a Cornell news release.

The device fits a number of e-cigarette types, fitting between the mouthpiece and the heating element. It sits idle until the user breathes in, which activates the e-cigarette’s circuits, and the PuffPacket’s as well. By paying attention to the voltage, it can tell how much liquid is being vaporized, as well as simpler measurements like the duration and timing of the inhalation.
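
To make the voltage-based measurement concrete, here is a minimal sketch of how puff energy might translate into vaporized liquid. The coil resistance and millilitres-per-joule constants below are hypothetical illustrations, not figures from the Cornell paper.

```python
def estimate_liquid_ml(voltage_v: float, duration_s: float,
                       coil_resistance_ohm: float = 1.5,
                       ml_per_joule: float = 0.00005) -> float:
    """Estimate millilitres of e-liquid vaporized during one puff.

    The heating coil dissipates V^2 / R watts; energy is power times
    duration; vaporized liquid is assumed proportional to that energy.
    """
    power_w = voltage_v ** 2 / coil_resistance_ohm
    energy_j = power_w * duration_s
    return energy_j * ml_per_joule

# A 3.7 V puff lasting 2 seconds vaporizes roughly 0.0009 ml
# under these illustrative constants.
puff_ml = estimate_liquid_ml(3.7, 2.0)
```

Note that puff duration and timing come directly from when the circuit is active; only the liquid estimate depends on a calibration constant like the one assumed here.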

An example using real data of how location and activity could be correlated with vaping

This data is sent to the smartphone app via Bluetooth, where it is cross-referenced with other information, like location, motion and other metadata. This may lead to identifiable patterns, like that someone vapes frequently when they walk in the morning but not the afternoon, or after coffee but not meals, or far more at the bar than at home — that sort of thing. Perhaps even (with the proper permissions) it could track use of certain apps — Instagram and vape? Post-game puff?

Some of these might be obvious, others not so much — but either way, it helps to have them backed up by real data rather than asking a person to estimate their own usage. They may not know, understand or wish to admit their own habits.
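
As a toy illustration of that kind of cross-referencing (not the actual PuffPacket app), logged puff events tagged with context metadata can simply be tallied to surface where vaping clusters; the event data here is invented:

```python
from collections import Counter
from datetime import datetime

# Hypothetical puff log: each event pairs a timestamp with context
# metadata inferred from location and activity data.
puff_log = [
    {"time": datetime(2020, 4, 1, 8, 15), "context": "morning walk"},
    {"time": datetime(2020, 4, 1, 8, 40), "context": "morning walk"},
    {"time": datetime(2020, 4, 1, 12, 5), "context": "after coffee"},
    {"time": datetime(2020, 4, 1, 21, 30), "context": "bar"},
    {"time": datetime(2020, 4, 1, 21, 45), "context": "bar"},
    {"time": datetime(2020, 4, 1, 21, 50), "context": "bar"},
]

# Count puffs per context to find where use concentrates.
by_context = Counter(event["context"] for event in puff_log)
most_common_context, count = by_context.most_common(1)[0]
# most_common_context == "bar", count == 3
```

A real study would join events against continuous GPS and motion traces rather than pre-labeled contexts, but the aggregation step is the same idea.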

“Getting these correlations between time of day, place and activity is important for understanding addiction. Research has shown that if you can keep people away from the paths of their normal habits, it can disrupt them,” said Adams.

No one is expecting people to voluntarily stick these things on their vape pens and share their info, but the design — which is being released as open source — could be used by researchers performing more formal studies. You can read the paper describing PuffPacket here.

A new FDA-authorized COVID-19 test doesn’t need a lab and can produce results in just 5 minutes

There’s a new COVID-19 test from healthcare technology maker Abbott that looks to be the fastest yet in terms of producing results, and that can do so on the spot right at point-of-care, without requiring a round trip to a lab. This test for the novel coronavirus causing the current global pandemic has received emergency clearance for use from the U.S. Food and Drug Administration and will begin production next week, with output ramping to 50,000 tests per day.

The new Abbott ID NOW COVID-19 test uses the Abbott ID NOW diagnostics platform, which is essentially a lab-in-a-box roughly the size of a small kitchen appliance. Its size, and the fact that it can produce either a positive result in just five minutes or a negative one in under 15, mean that it could be a very useful way to extend coronavirus testing beyond its current availability to more places, including clinics and doctors’ offices, and to cut down on wait times both for getting tested and for receiving a diagnosis.

Unlike the rapid tests used in other countries, which received a new type of authorization under an FDA guideline that doesn’t confirm the accuracy of the results, this rapid testing solution uses the molecular testing method, which works with saliva and mucus samples swabbed from a patient. It identifies a portion of the virus’s RNA in the sample, which makes it much better at detecting the actual presence of the virus during an active infection. Other point-of-care tests that search the blood for antibodies can only detect antibodies, which might be present in recovered patients who no longer actively have the virus.

The good news for availability of this test is that ID NOW, the hardware from Abbott that it runs on, already “holds the largest molecular point-of-care footprint in the U.S.,” and is “widely available” across doctor’s offices, urgent care clinics, emergency rooms and other medical facilities.

In total, Abbott now says it believes it will produce 5 million tests in April, split between these new rapid tests and the lab tests that received emergency use authorization from the FDA on March 18.

Testing has been one of the early problems faced by the U.S. in terms of getting a handle on the coronavirus pandemic: The country has lagged behind other nations globally in terms of per capita tests conducted, which experts say has hampered its ability to properly track and trace the spread of the virus and its resulting respiratory disease. Patients have reported having to go to extreme lengths to receive a test, and endure long waits for results, even in cases where exposure was likely and their symptoms match the COVID-19 profile.

YC startup Felix wants to replace antibiotics with programmable viruses

Right now the world is at war. But this is no ordinary war. It’s a fight with an organism so small we can only detect it through use of a microscope — and if we don’t stop it, it could kill millions of us in the next several decades. No, I’m not talking about COVID-19, though that organism is the one on everyone’s mind right now. I’m talking about antibiotic-resistant bacteria.

You see, more than 700,000 people died globally from bacterial infections last year — 35,000 of them in the U.S. If we do nothing, that number could grow to 10 million annually by 2050, according to a United Nations report.

The problem? Antibiotic overuse at the doctor’s office or in livestock and farming practices. We used a lot of drugs over time to kill off all the bad bacteria – but they only killed off most, not all, of the bad bacteria. And, as the famous line from Jeff Goldblum in Jurassic Park goes, “life finds a way.”

Enter Felix, a biotech startup in the latest Y Combinator batch that thinks it has a novel approach to keeping bacterial infections at bay – viruses.

Phage killing bacteria in a petri dish

It seems weird, in a time of widespread concern over the coronavirus, to be looking at any virus in a good light. But as co-founder Robert McBride explains it, Felix’s key technology allows him to target his virus to specific sites on bacteria. This not only kills off the bad bacteria but can also halt their ability to evolve and once more become resistant.

But the idea of using a virus to kill off bacteria is not necessarily new. Bacteriophages, or viruses that can “infect” bacteria, were first discovered by an English researcher in 1915, and commercialized phage therapy began in the U.S. in the 1940s through Eli Lilly and Company. Right about then, antibiotics came along, and Western scientists never seemed to explore the therapy further.

However, with too few new solutions being offered and the standard drug model not working effectively to combat the situation, McBride believes his company can put phage therapy back at the forefront.

Already Felix has tested its solution on an initial group of 10 people to demonstrate its approach.

Felix researcher helping cystic fibrosis patient Ella Balasa through phage therapy

“We can develop therapies in less time and for less money than traditional antibiotics because we are targeting orphan indications and we already know our therapy can work in humans,” McBride told TechCrunch. “We argue that our approach, which re-sensitizes bacteria to traditional antibiotics, could be a first-line therapy.”

Felix plans to deploy its treatment for bacterial infections in those suffering from cystic fibrosis first as these patients tend to require a near constant stream of antibiotics to combat lung infections.

The next step will be to conduct a small clinical trial involving 30 people, then, as the scientific research and development model tends to go, a larger human trial before seeking FDA approval. But McBride hopes his viral solution will prove itself out in time to help the coming onslaught of antibiotic resistance.

“We know the antibiotic resistant challenge is large now and is only going to get worse,” McBride said. “We have an elegant technological solution to this challenge and we know our treatment can work. We want to contribute to a future in which these infections do not kill more than 10 million people a year, a future we can get excited about.”

The People and Tech Behind Data Science Explained

Within the past few years, the study of data science and machine learning has exploded into its own job field. However, the tech subgenre has been working its way toward the mainstream for nearly three centuries. It all started sometime in the 1740s with Bayes’ Theorem.
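
For readers who haven’t met it, Bayes’ Theorem is a rule for updating a prior probability in light of new evidence: P(A|B) = P(B|A)·P(A) / P(B). A quick worked example in Python, with purely illustrative numbers:

```python
def bayes_posterior(prior: float, sensitivity: float,
                    false_positive_rate: float) -> float:
    """P(condition | positive test) via Bayes' Theorem.

    P(positive) is expanded over both the condition and no-condition
    cases, then the theorem gives the posterior.
    """
    p_positive = (sensitivity * prior
                  + false_positive_rate * (1 - prior))
    return sensitivity * prior / p_positive

# A condition affecting 1% of people, with a 99%-sensitive test
# that false-positives 5% of the time:
posterior = bayes_posterior(prior=0.01, sensitivity=0.99,
                            false_positive_rate=0.05)
# posterior is exactly 1/6: a positive test still leaves only ~17% odds.
```

That counterintuitive result, where a seemingly accurate test yields mostly false positives on a rare condition, is a classic example of why the theorem underpins so much of statistical reasoning.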

Today, the demand for data scientists is at its peak and is only continuing to surge. By the end of this year, there will have been 2.7 million data scientist job openings. The million-dollar question is:

What do data scientists do?

To be direct, data science is the process of analyzing data. To explain with more complexity, data scientists use heterogeneous data – data composed of different forms or dissimilar components – to solve rigorous problems.

To do so, data scientists draw on their skills in computer science, high-level mathematics, and more. These skills are especially important for data scientists developing industrial machine learning technologies, programs, and enterprise AI.


In short, data scientists take three steps in analyzing their data: preparing, testing, and capturing. Of course, 2-3 individual tasks are necessary to complete each step.

Preparing

To prepare for data analysis, data must be first captured. In other words, the first task in a data scientist’s work is to simply collect their data. Scientists can extract or acquire them. To further prepare for data analysis, the scientist then maintains their data, safely storing and staging it.

Some data storage methods include scalable storage optimized with artificial intelligence. Finally, the data undergo processing which involves mining and classification.

Testing

Once data scientists are done with preparation, having completed the previously mentioned tasks, they move on to actually testing their data. Think back to your public school science fair days:

What must you do before testing your data?

Draft a hypothesis, or an educated guess, which involves developing a theory to test with the data model. After this, the data is finally ready for analysis. This is the stage where new findings based on the initially collected data are discovered. It’s often done by modeling, exploring, and experimenting with the data to reach desired outcomes and decide what the data means.


Capturing

All that’s left to do at this point is to communicate the results. Reconnecting data science to science fair procedures, how would you express your final discoveries? Perhaps a visual aid, such as a tri-fold poster board.

While data scientists may not necessarily take this route of expression, they create an easily understood picture of the model’s predictions, one that translates readily for an audience of laymen. The final task in data analysis is to apply your results: helping end-users understand how to use the predictions to take effective actions within their business.
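
The prepare, test, and capture steps above can be sketched end to end with a toy dataset; the records and the linear-trend hypothesis here are invented purely for illustration:

```python
from statistics import mean

# Prepare: collect raw records, then clean out unusable rows.
raw = [("2020-01", 10.0), ("2020-02", None), ("2020-03", 14.0),
       ("2020-04", 16.0)]
clean = [(month, value) for month, value in raw if value is not None]

# Test: hypothesize a linear trend and fit slope/intercept by
# least squares over the cleaned series.
xs = list(range(len(clean)))
ys = [value for _, value in clean]
x_bar, y_bar = mean(xs), mean(ys)
slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
        sum((x - x_bar) ** 2 for x in xs)
intercept = y_bar - slope * x_bar

# Capture: communicate the finding in plain terms for end-users.
prediction = slope * len(clean) + intercept
print(f"Trend: +{slope:.1f} per month; next value about {prediction:.1f}")
# prints: Trend: +3.0 per month; next value about 19.3
```

A real project would swap the hand-rolled fit for libraries like pandas and scikit-learn, but the three-step shape of the work stays the same.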

See Also: The Importance of Big Data In Business And Your Success

Who are data scientists?

More importantly, how do you become one?

Jenn Gamble, Director of Noodle.ai – the leading machine learning software giant – spoke on the subject, saying, “You don’t necessarily need a Ph.D. to do data science – you need an aptitude for math and a creative, problem-solving mentality.”

By 2025, the world’s data is projected to total 175 billion terabytes, so the primary way to fully understand and analyze this surging volume is to hire more data scientists with access to advanced tools.

Some of the most popular tools in the industry include the R programming language, Python, PyTorch, Hadoop, and Apache Spark. Among the most crucial roles needing fulfillment in the machine learning job economy are data engineers, AI hardware specialists, and software engineers.

Data engineers create and maintain the pipelines that bring in data, needing skills in Scikit-learn, AForge.NET, and/or the Java programming language. Software engineers analyze business data and design software to fit needs, needing skills in Java, SQL, and/or Python. Lastly, AI hardware specialists create and program AI to perform specific tasks, needing skills in machine learning, Python, SaaS, and Java.

Data science provides opportunities for people to express their creativity.

It gives them the means to create technology that can initiate changes worldwide. Think about all of the space exploration, autonomous vehicles, personalized medicine, and personalized education that have been created within the past few years. They are the work of data scientists.

This isn’t all, however.

Data scientists have also created technologies capable of monitoring wildlife migration and optimizing energy use. The essential role of data science is beyond question. In fact, between just 2011 and 2012, job listings for “data scientist” increased by 15,000%.


The post The People and Tech Behind Data Science Explained appeared first on Dumb Little Man.

Treating Arthritis Naturally: Home Remedies For Arthritis and Joint Pain

Arthritis is one of the most common chronic illnesses, affecting millions around the globe. It is a disease that causes inflammation of the joints and is considered a rheumatic condition. It doesn’t just harm the joints; sometimes it can also affect the ligaments, tendons, muscles, and bones.

If the condition spreads, it can also affect other internal organs and the immune system. The symptoms of arthritis include severe ache, pain, stiffness, and swelling in bones and muscles. Arthritis can also drain a person mentally, causing sleep disorders and depression.

Arthritis can cause severe fatigue and pain which can make daily tasks appear daunting.

What causes arthritis?

There are several primary types of arthritis, including Osteoarthritis, Rheumatoid Arthritis, Psoriatic Arthritis, and Gout. Each type of arthritis has its own set of causes.

Below are some of the primary causes of arthritis:

  • Injury
  • Infections
  • Immune disorders
  • Heredity
  • Abnormal metabolism
  • Muscle disorders
  • Age
  • Obesity

How to treat arthritis?

The treatment for arthritis involves strengthening the bones, joints, and muscles, controlling the pain, and alleviating the symptoms. Treatment and care also focus on minimizing the damage to the joints and nearby muscles and organs.

Home and natural remedies for arthritis involve adjusting dietary and lifestyle habits. They can help treat arthritis and also prevent its further spread.

Natural remedies for arthritis

Ginger


Ginger is favored around the globe for its rich medical properties. It contains anti-inflammatory and antioxidant properties. These and other components of ginger work together in reducing pain, inflammation, and disability.

Ginger also promotes weight loss and boosts immunity which improves the body’s ability to alleviate arthritis. It plays another critical role by increasing blood circulation and supporting the affected area with its healing properties.

Use powdered ginger to add flavors to dishes. Add grated ginger root to soups, teas or juices to experience its benefits.

Broccoli


Broccoli is a superfood that reduces inflammation. It is rich in calcium, potassium, and magnesium which are known to be highly beneficial in treating arthritis.

See Also: A List of Superfoods You Definitely Need in Your Diet

Adding broccoli to salads is the most common way to take in its benefits.

Healthy dietary management to fight arthritis

Below are some of the foods for arthritis patients:

  • Sulfur-rich foods: asparagus, cabbage, garlic, and onions.
  • Antioxidant foods: Brussels sprouts, broccoli, carrots, spinach, red and green peppers.
  • Vitamin-rich fruits: avocados, berries, melons, papaya, pineapple.

Herbal teas, garlic, turmeric, and cinnamon are highly recommended. These are loaded with anti-inflammatory and immune system boosting properties.

These foods help in rebuilding the damaged tissues, inhibit or reduce inflammation, and alleviate the pain associated with arthritis.

These foods also help in reducing weight and improving the immune system and overall health, which can ease the treatment of the condition.

Essential nutrients for managing arthritis

Vitamin D

Vitamin D is one of the essential nutrients that helps in absorbing calcium and building strong bones. It also regulates the immune system to fight arthritis from inside.

Vitamin D is also vital for muscle movement, fighting inflammation, and enriching the communication between nerves. These attributes help reduce the associated symptoms and reduce stress.

Sunlight and yogurt are among the best sources of vitamin D. Supplements are also recommended to fuel the body with significant amounts of vitamin D.

Omega-3 fatty acids

Omega-3 fatty acids help in reducing inflammation and improving the immune system. These essential fatty acids increase the blood flow, reduce joint pains, inflammation, tenderness, swelling, and discomfort in the joints and knees.

Seafood is an excellent source of omega-3 fatty acids. Generally, supplements of omega-3 fatty acids are also recommended for those who do not eat fish. You can also get them from nuts and seeds.

Glucosamine, magnesium, and calcium are other essential nutrients that should be taken in significant amounts to treat arthritis.

Warm water therapy

Warm water therapy dilates the blood vessels, which in turn increases the flow of blood, oxygen, and essential nutrients to repair damaged tissues. It also eases stiffness and relaxes the joints and muscles.

How to do it:

  • Relax in a tub of warm water or soak your legs in warm water.
  • The other way is to apply a warm compress to the affected area.

Avoid this therapy if you see any redness or swelling. In that case, try a cold compress instead: it does the opposite, reducing blood flow and decreasing inflammation.

Massage

Arthritis, as mentioned above, can limit mobility or cause severe pain during physical workouts. In such cases, massage therapy can be helpful. Regular massage can soothe the pain, improve blood flow, increase flexibility, and strengthen the joints and muscles.

Besides, massage is known to inhibit the stress hormone and boost the serotonin levels, which can improve the mood.

Walking and swimming are other physical exercises that can help in treating arthritis. Go easy on this routine, or work out under an expert’s guidance. These simple exercises can reduce fat and improve coordination, mood, and quality of life.

What to Avoid


A person who has arthritis should take necessary measures to avoid the following:

  • Flour and wheat products
  • Red meat
  • Alcohol
  • Processed or starchy foods
  • High-sugar foods
  • High-intensity exercises
  • Weight-lifting

Conclusion

These tips on treating arthritis naturally can help in alleviating the symptoms, reducing the pain, and treating the condition in its early stages. These remedies also help improve everyday physical and mental abilities.

However, in severe or chronic cases of arthritis, it is a must to get proper treatment from a doctor. Otherwise, the condition could cause permanent damage and spread to other parts of the body.

See Also: Words of Advice To Keep Your Joints Healthy

Author Bio:

Henna is a wellness lifestyle writer. She loves sharing her thoughts and personal experiences related to natural remedies, Ayurveda, yoga, and fitness through her writing. She currently writes for How To Cure, where she connects with others experiencing health concerns and helps them through their recovery journeys with natural remedies.

The post Treating Arthritis Naturally: Home Remedies For Arthritis and Joint Pain appeared first on Dumb Little Man.

Fiddler Labs, SRI and Berkeley experts open up the black box of machine learning at TC Sessions: Robotics+AI

As AI permeates the home, work, and public life, it’s increasingly important to be able to understand why and how it makes its decisions. Explainable AI isn’t just a matter of hitting a switch, though; experts from UC Berkeley, SRI, and Fiddler Labs will discuss how we should go about it on stage at TC Sessions: Robotics+AI on March 3.

What does explainability really mean? Do we need to start from scratch? How do we avoid exposing proprietary data and methods? Will there be a performance hit? Whose responsibility will it be, and who will ensure it is done properly?

On our panel addressing these questions and more will be two experts, one each from academia and private industry.

Trevor Darrell is a professor at Berkeley’s Computer Science department who helps lead many of the university’s AI-related labs and projects, especially those concerned with the next generation of smart transportation. His research group focuses on perception and human-AI interaction, and he previously led a computer vision group at MIT.

Krishna Gade has spent time at Facebook, Pinterest, Twitter, and Microsoft, and has seen firsthand how AI is developed privately – and how biases and flawed processes can lead to troubling results. He co-founded Fiddler as an effort to address problems of fairness and transparency by providing an explainable AI framework for enterprises.

Moderating and taking part in the discussion will be SRI International’s Karen Myers, director of the research outfit’s Artificial Intelligence Center and an AI developer herself focused on collaboration, automation, and multi-agent systems.

Save $50 on tickets when you book today. Ticket prices go up at the door, and tickets are selling fast. We have two (yes, two) Startup Demo Packages left – book your package now and get your startup in front of 1,000+ of today’s leading industry minds. Packages come with four tickets – book here.