From University of Washington (US): “Accurate protein structure prediction now accessible to all”

From University of Washington (US)


July 15, 2021

Leila Gray
206.475.9809
leilag@uw.edu

New artificial intelligence software can compute protein structures in 10 minutes.

Scientists have waited months for access to high-accuracy protein structure prediction since DeepMind presented remarkable progress in this area at the 2020 Critical Assessment of Structure Prediction, or CASP14, conference. The wait is now over.

Researchers at the Institute for Protein Design at the University of Washington School of Medicine in Seattle have largely recreated the performance achieved by DeepMind on this important task. These results are published online today, July 15, by the journal Science.

Unlike DeepMind, the UW Medicine team has already made their method, dubbed RoseTTAFold, freely available. Scientists from around the world are now using it to build protein models to accelerate their own research. Soon after its recent upload, the program was downloaded from GitHub by over 140 independent research teams.

Researchers used artificial intelligence to generate hundreds of new protein structures, including this 3D view of human interleukin-12 bound to its receptor. Credit: Ian Haydon.

Proteins consist of strings of amino acids that fold up into intricate microscopic shapes. These unique shapes in turn give rise to nearly every chemical process inside living organisms. By better understanding protein shapes, scientists can speed up the development of new treatments for cancer, COVID-19, and thousands of other medical disorders.

“It has been a busy year at the Institute for Protein Design, designing COVID-19 therapeutics and vaccines and launching these into clinical trials, along with developing RoseTTAFold for high accuracy protein structure prediction. I am delighted that the scientific community is already using the RoseTTAFold server to solve outstanding biological problems,” said senior author David Baker, Howard Hughes Medical Institute Investigator, professor of biochemistry, and director of the Institute for Protein Design at UW Medicine.


In the new study, a team of computational biologists led by Baker developed a software tool called RoseTTAFold that uses deep learning to quickly and accurately predict protein structures based on limited information. Without the aid of such software, it can take years of laboratory work to determine the structure of just one protein.

RoseTTAFold, on the other hand, can reliably compute a protein structure in as little as 10 minutes on a single gaming computer. The team used RoseTTAFold to compute hundreds of new protein structures, including many poorly understood proteins from the human genome. They also generated structures directly relevant to human health, including for proteins associated with problematic lipid metabolism, inflammation disorders, and cancer cell growth. And they showed that RoseTTAFold can be used to build models of complex biological assemblies in a fraction of the time previously required.

RoseTTAFold is a “three-track” neural network, meaning it simultaneously considers patterns in protein sequences, how a protein’s amino acids interact with one another, and a protein’s possible three-dimensional structure. In this architecture, one-, two-, and three-dimensional information flows back and forth, allowing the network to collectively reason about the relationship between a protein’s chemical parts and its folded structure.
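
RoseTTAFold itself is open source on GitHub; the toy sketch below is not that code. It is a minimal PyTorch illustration, under invented assumptions, of what “three tracks exchanging information” can look like: the layer sizes, pooling choices and update rules are made up for clarity, and a single block is shown where the real network stacks many.

```python
# Illustrative toy sketch of a "three-track" block, NOT RoseTTAFold's code.
# All dimensions and update rules are invented assumptions for illustration.
import torch
import torch.nn as nn

class ThreeTrackBlock(nn.Module):
    def __init__(self, d_seq=64, d_pair=32):
        super().__init__()
        # 1D track: per-residue sequence features
        self.seq_update = nn.Linear(d_seq + d_pair, d_seq)
        # 2D track: per-residue-pair features (e.g., distances/contacts)
        self.pair_update = nn.Linear(d_pair + 2 * d_seq, d_pair)
        # 3D track: per-residue coordinates nudged by pair information
        self.coord_update = nn.Linear(d_pair, 3)

    def forward(self, seq, pair, coords):
        # seq: (L, d_seq); pair: (L, L, d_pair); coords: (L, 3)
        L = seq.shape[0]
        # 1D <- 2D: each residue sees a summary of its row of pair features
        pair_summary = pair.mean(dim=1)
        seq = torch.relu(self.seq_update(torch.cat([seq, pair_summary], dim=-1)))
        # 2D <- 1D: each pair sees the sequence features of both residues
        si = seq.unsqueeze(1).expand(L, L, -1)
        sj = seq.unsqueeze(0).expand(L, L, -1)
        pair = torch.relu(self.pair_update(torch.cat([pair, si, sj], dim=-1)))
        # 3D <- 2D: coordinates are displaced using pooled pair features
        coords = coords + self.coord_update(pair.mean(dim=1))
        return seq, pair, coords

# Toy usage on a 10-residue "protein"
block = ThreeTrackBlock()
seq, pair, coords = torch.randn(10, 64), torch.randn(10, 10, 32), torch.zeros(10, 3)
seq, pair, coords = block(seq, pair, coords)
print(coords.shape)  # torch.Size([10, 3])
```

In the real network the exchanges are richer (attention rather than simple pooling, for example), but the back-and-forth between the three representations is the key idea.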

“We hope this new tool will continue to benefit the entire research community,” said Minkyung Baek, a postdoctoral scholar who led the project in the Baker laboratory at UW Medicine.

This work was supported in part by Microsoft, Open Philanthropy Project, Schmidt Futures, Washington Research Foundation, National Science Foundation, Wellcome Trust, and the National Institutes of Health. A full list of supporters is available in the Science paper.

See the full article here.



Please help promote STEM in your local schools.

Stem Education Coalition


The University of Washington (US) is one of the world’s preeminent public universities. Our impact on individuals, on our region, and on the world is profound — whether we are launching young people into a boundless future or confronting the grand challenges of our time through undaunted research and scholarship. Ranked number 10 in the world in Shanghai Jiao Tong University rankings and educating more than 54,000 students annually, our students and faculty work together to turn ideas into impact and in the process transform lives and our world.

So what defines us —the students, faculty and community members at the University of Washington? Above all, it’s our belief in possibility and our unshakable optimism. It’s a connection to others, both near and far. It’s a hunger that pushes us to tackle challenges and pursue progress. It’s the conviction that together we can create a world of good. Join us on the journey.

The University of Washington (US) is a public research university in Seattle, Washington, United States. Founded in 1861, University of Washington is one of the oldest universities on the West Coast; it was established in downtown Seattle approximately a decade after the city’s founding to aid its economic development. Today, the university’s 703-acre main Seattle campus is in the University District above the Montlake Cut, within the urban Puget Sound region of the Pacific Northwest. The university has additional campuses in Tacoma and Bothell. Overall, University of Washington encompasses over 500 buildings and over 20 million gross square feet of space, including one of the largest library systems in the world with more than 26 university libraries, as well as the UW Tower, lecture halls, art centers, museums, laboratories, stadiums, and conference centers. The university offers bachelor’s, master’s, and doctoral degrees through 140 departments in various colleges and schools, sees a total student enrollment of roughly 46,000 annually, and functions on a quarter system.

University of Washington is a member of the Association of American Universities (US) and is classified among “R1: Doctoral Universities – Very high research activity”. According to the National Science Foundation (US), UW spent $1.41 billion on research and development in 2018, ranking it 5th in the nation. As the flagship institution of the six public universities in Washington state, it is known for its medical, engineering and scientific research as well as its highly competitive computer science and engineering programs. Additionally, University of Washington continues to benefit from its deep historic ties and major collaborations with numerous technology giants in the region, such as Amazon, Boeing, Nintendo, and particularly Microsoft. Paul G. Allen, Bill Gates and others spent significant time in Washington’s computer labs on a startup venture before founding Microsoft and other companies. The University of Washington’s 22 varsity sports teams are also highly competitive, competing as the Huskies in the Pac-12 Conference of NCAA Division I, representing the United States at the Olympic Games, and other major competitions.

The university has been affiliated with many notable alumni and faculty, including 21 Nobel Prize laureates and numerous Pulitzer Prize winners, Fulbright Scholars, Rhodes Scholars and Marshall Scholars.

In 1854, territorial governor Isaac Stevens recommended the establishment of a university in the Washington Territory. Prominent Seattle-area residents, including Methodist preacher Daniel Bagley, saw this as a chance to add to the city’s potential and prestige. Bagley learned of a law that allowed United States territories to sell land to raise money in support of public schools. At the time, Arthur A. Denny, one of the founders of Seattle and a member of the territorial legislature, aimed to increase the city’s importance by moving the territory’s capital from Olympia to Seattle. However, Bagley eventually convinced Denny that the establishment of a university would assist more in the development of Seattle’s economy. Two universities were initially chartered, but later the decision was repealed in favor of a single university in Lewis County provided that locally donated land was available. When no site emerged, Denny successfully petitioned the legislature to reconsider Seattle as a location in 1858.

In 1861, scouting began for an appropriate 10-acre (4 ha) site in Seattle to serve as a new university campus. Arthur and Mary Denny donated eight acres, while fellow pioneers Edward Lander, and Charlie and Mary Terry, donated two acres on Denny’s Knoll in downtown Seattle. More specifically, this tract was bounded by 4th Avenue to the west, 6th Avenue to the east, Union Street to the north, and Seneca Street to the south.

John Pike, for whom Pike Street is named, was the university’s architect and builder. It was opened on November 4, 1861, as the Territorial University of Washington. The legislature passed articles incorporating the University and establishing its Board of Regents in 1862. The school initially struggled, closing three times: in 1863 for low enrollment, and again in 1867 and 1876 due to funding shortages. The University of Washington awarded its first degree, a bachelor’s in science, to Clara Antoinette McCarty Wilt in 1876.

19th century relocation

By the time Washington state entered the Union in 1889, both Seattle and the University had grown substantially. University of Washington’s total undergraduate enrollment increased from 30 to nearly 300 students, and the campus’s relative isolation in downtown Seattle faced encroaching development. A special legislative committee, headed by University of Washington graduate Edmond Meany, was created to find a new campus to better serve the growing student population and faculty. The committee eventually selected a site on the northeast of downtown Seattle called Union Bay, which was the land of the Duwamish, and the legislature appropriated funds for its purchase and construction. In 1895, the University relocated to the new campus by moving into the newly built Denny Hall. The University Regents tried and failed to sell the old campus, eventually settling with leasing the area. This would later become one of the University’s most valuable pieces of real estate in modern-day Seattle, generating millions in annual revenue with what is now called the Metropolitan Tract. The original Territorial University building was torn down in 1908, and its former site now houses the Fairmont Olympic Hotel.

The sole surviving remnants of Washington’s first building are four 24-foot (7.3 m), white, hand-fluted cedar Ionic columns. They were salvaged by Edmond S. Meany, one of the University’s first graduates and former head of its history department. Meany and his colleague, Dean Herbert T. Condon, dubbed the columns “Loyalty,” “Industry,” “Faith,” and “Efficiency,” or “LIFE.” The columns now stand in the Sylvan Grove Theater.

20th century expansion

Organizers of the 1909 Alaska-Yukon-Pacific Exposition eyed the still largely undeveloped campus as a prime setting for their world’s fair. They came to an agreement with Washington’s Board of Regents that allowed them to use the campus grounds for the exposition, surrounding today’s Drumheller Fountain facing towards Mount Rainier. In exchange, organizers agreed Washington would take over the campus and its development after the fair’s conclusion. This arrangement led to a detailed site plan and several new buildings, prepared in part by John Charles Olmsted. The plan was later incorporated into the overall University of Washington campus master plan, permanently affecting the campus layout.

Both World Wars brought the military to campus, with certain facilities temporarily lent to the federal government. In spite of this, subsequent post-war periods were times of dramatic growth for the University. The period between the wars saw a significant expansion of the upper campus. Construction of the Liberal Arts Quadrangle, known to students as “The Quad,” began in 1916 and continued to 1939. The University’s architectural centerpiece, Suzzallo Library, was built in 1926 and expanded in 1935.

After World War II, further growth came with the G.I. Bill. Among the most important developments of this period was the opening of the School of Medicine in 1946, which is now consistently ranked as the top medical school in the United States. It would eventually lead to the University of Washington Medical Center, ranked by U.S. News and World Report as one of the top ten hospitals in the nation.

In 1942, all persons of Japanese ancestry in the Seattle area were forced into inland internment camps as part of Executive Order 9066 following the attack on Pearl Harbor. During this difficult time, university president Lee Paul Sieg took an active and sympathetic leadership role in advocating for and facilitating the transfer of Japanese American students to universities and colleges away from the Pacific Coast to help them avoid the mass incarceration. Nevertheless, many Japanese American students and soon-to-be graduates were unable to transfer successfully in the short time window or receive diplomas before being incarcerated. It was only many years later that they would be recognized for their accomplishments during the University of Washington’s Long Journey Home ceremonial event that was held in May 2008.

From 1958 to 1973, the University of Washington saw tremendous growth in student enrollment, faculty, and operating budget, as well as in its prestige, under the leadership of Charles Odegaard. University of Washington student enrollment had more than doubled to 34,000 as the baby boom generation came of age. However, this era was also marked by high levels of student activism, as was the case at many American universities. Much of the unrest focused on civil rights and opposition to the Vietnam War. In response to anti-Vietnam War protests by the late 1960s, the University Safety and Security Division became the University of Washington Police Department.

Odegaard instituted a vision of building a “community of scholars”, convincing the Washington State Legislature to increase investment in the University. Washington senators, such as Henry M. Jackson and Warren G. Magnuson, also used their political clout to gather research funds for the University of Washington. The results included an increase in the operating budget from $37 million in 1958 to over $400 million in 1973, solidifying University of Washington as a top recipient of federal research funds in the United States. The establishment of technology giants such as Microsoft, Boeing and Amazon in the local area also proved to be highly influential in the University of Washington’s fortunes, not only improving graduate prospects but also helping to attract millions of dollars in university and research funding through its distinguished faculty and extensive alumni network.

21st century

In 1990, the University of Washington opened its additional campuses in Bothell and Tacoma. Although originally intended for students who had already completed two years of higher education, both schools have since become four-year universities with the authority to grant degrees. The first freshman classes at these campuses started in fall 2006. Today both Bothell and Tacoma also offer a selection of master’s degree programs.

In 2012, the University began exploring plans and governmental approval to expand the main Seattle campus, including significant increases in student housing, teaching facilities for the growing student body and faculty, as well as expanded public transit options. The University of Washington light rail station was completed in March 2015, connecting Seattle’s Capitol Hill neighborhood to the University of Washington Husky Stadium within five minutes of rail travel time. It offers a previously unavailable option of transportation into and out of the campus, designed specifically to reduce dependence on private vehicles, bicycles and local King County buses.

University of Washington has been listed as a “Public Ivy” in Greene’s Guides since 2001, and is an elected member of the Association of American Universities. Among the faculty by 2012, there have been 151 members of the American Association for the Advancement of Science, 68 members of the National Academy of Sciences (US), 67 members of the American Academy of Arts and Sciences, 53 members of the National Academy of Medicine (US), 29 winners of the Presidential Early Career Award for Scientists and Engineers, 21 members of the National Academy of Engineering (US), 15 Howard Hughes Medical Institute Investigators, 15 MacArthur Fellows, 9 winners of the Gairdner Foundation International Award, 5 winners of the National Medal of Science, 7 Nobel Prize laureates, 5 winners of the Albert Lasker Award for Clinical Medical Research, 4 members of the American Philosophical Society, 2 winners of the National Book Award, 2 winners of the National Medal of Arts, 2 Pulitzer Prize winners, 1 winner of the Fields Medal, and 1 member of the National Academy of Public Administration. Among UW students by 2012, there were 136 Fulbright Scholars, 35 Rhodes Scholars, 7 Marshall Scholars and 4 Gates Cambridge Scholars. UW is recognized as a top producer of Fulbright Scholars, ranking 2nd in the US in 2017.

The Academic Ranking of World Universities (ARWU) has consistently ranked University of Washington as one of the top 20 universities worldwide every year since its first release. In 2019, University of Washington ranked 14th worldwide out of 500 by the ARWU, 26th worldwide out of 981 in the Times Higher Education World University Rankings, and 28th worldwide out of 101 in the Times Higher Education World Reputation Rankings. Meanwhile, QS World University Rankings ranked it 68th worldwide, out of over 900.

U.S. News & World Report ranked University of Washington 8th out of nearly 1,500 universities worldwide for 2021, with University of Washington’s undergraduate program tied for 58th among 389 national universities in the U.S. and tied for 19th among 209 public universities.

In 2019, it ranked 10th among the universities around the world by SCImago Institutions Rankings. In 2017, the Leiden Ranking, which focuses on science and the impact of scientific publications among the world’s 500 major universities, ranked University of Washington 12th globally and 5th in the U.S.

In 2019, Kiplinger Magazine’s review of “top college values” named University of Washington 5th for in-state students and 10th for out-of-state students among U.S. public colleges, and 84th overall out of 500 schools. In the Washington Monthly National University Rankings University of Washington was ranked 15th domestically in 2018, based on its contribution to the public good as measured by social mobility, research, and promoting public service.

From North Carolina State University (US): “Making Citizen Science Inclusive Will Require More Than Rebranding”


From North Carolina State University (US)

June 24, 2021
Laura Oleniacz

A tree in the Court of North Carolina, tagged with a QR code, allows people to answer questions for a citizen science project as part of a College of Natural Resources research project. Credit: North Carolina State University.

Scientists need to focus on tangible efforts to boost equity, diversity and inclusion in citizen science, researchers from North Carolina State University argued in a new perspective.

Published in the journal Science, the perspective is a response to a debate about rebranding “citizen science,” the movement to use crowdsourced data collection, analysis or design in research. The researchers said that while the motivation for rebranding responds to a real concern, rebranding will come at a cost, and efforts to make projects more inclusive should go deeper than a name change. Their recommendations speak to a broader discussion about how to ensure science is responsive to the needs of a diverse audience.

“At its heart, citizen science is a system of knowledge production that doesn’t block entry based on credentials,” said first author Caren Cooper, associate professor of forestry and environmental resources at NC State. “Those of us in citizen science have been saying ‘science is for everyone, you don’t need a degree or special training.’ But, the sad irony is that it hasn’t been for everyone. The overwhelming majority of participants resemble their academic counterparts, who are often white, affluent and have advanced degrees. We want to take the good intentions that are driving rebranding, and commit to long-term, sustained efforts to reimagine an inclusive citizen science.”

The term “citizen science” was coined in the 1990s, researchers said, to describe science led by institutions that use volunteers to collect data. It has evolved to encompass many types of projects with public involvement in design, leadership or data collection and analysis. As a “citizen science campus,” NC State has projects underway in which undergraduates, faculty, staff and the general public can help collect data. Examples include projects that rely on volunteers to help figure out the microbial content of sourdough bread or detect the presence of lead pipes in homes around the state.

In an effort to resolve concerns that the term is exclusionary to people who do not have citizenship status in a given nation, some organizations have moved toward using the term “community science,” among other names. But researchers said community science is a distinct and existing research movement led and designed by communities, rather than institutions, to address environmental or social justice problems.

“It’s a huge dis to community science to flippantly change the name like it isn’t already being utilized, and could be considered disrespectful to people who are doing this work and have been for many years,” said co-author Zakiya Leggett, assistant professor of forestry and environmental resources. “If you have a citizen science project, but you advertise it as ‘community science,’ it does a disservice to both practices.”

In addition, there is a cost to losing the term “citizen science,” they said, since the term has gained momentum globally. In the United States, the term is used in a federal law authorizing the government to include volunteers in scientific research irrespective of their credentials and citizenship status.

“There is a lot of work that has gone toward incorporating ‘citizen science’ as a part of policy, as well as being accepted into mainstream science,” said co-author Madhusudan Katti, associate professor of forestry and environmental resources at NC State. “The name has been caught up in politicization of citizenship and nationalist politics, and rebranding is a little bit reactive. The concern is genuine, but the fix is not deep enough. Renaming something doesn’t make it different from what it’s been all along.”

The researchers argued for strategic planning to advance accessibility, justice, equity, diversity and inclusion in citizen science.

“One approach that could work for citizen science is ‘centering in the margins.’ That can include centering research agendas based on the areas that are underserved by science,” Cooper said.

Other tactics could involve ensuring there are diverse perspectives in project leadership, or overcoming economic barriers to participation. They also said there is a need for funding to support science that addresses the interests, concerns and needs of people who have historically been or are currently underserved by science.

They said rebranding, if needed, should only happen if it is called for as part of a broader strategic plan. They also said rebranding efforts should refrain from co-opting existing terminology, avoid exporting issues in the United States to the rest of the world, and identify terminology to help further clarify distinctions for different types of projects.

“We wanted the fact that diversity and inclusion in citizen science remains elusive to serve as a canary in the coal mine to the rest of the scientific community – it takes far more than words and good intentions to be inclusive,” Cooper said. “We can learn from community science without co-opting it. We need to figure this out without expecting quick-fix solutions, because those can do more harm than good.”

The perspective, “Inclusion in citizen science: The conundrum of rebranding,” was published online in Science [above]. In addition to Cooper, Katti and Leggett, other authors included Chris L. Hawn, Lincoln R. Larson, Julia K. Parrish, Gillian Bowser, Darlene Cavalier, Robert R. Dunn, Mordechai (Muki) Haklay, Kaberia Kar Gupta, Na’Taki Osborne Jelks, Valerie A. Johnson, Omega R. Wilson and Sacoby Wilson. Researchers reported funding from the National Science Foundation (US), through grant No. 1713562, to Cooper and Larson.

Interested in participating in citizen science? Here are some ideas:

All of these will use your device without interrupting your own use of it.

World Community Grid [WCG] is the home for many projects which depend on a community of “crunchers” to process data on home computers, cell phones, etc. You sign up here, you look at the projects and you attach to those in which you are interested. Here are some examples:

WCG Help Stop TB
WCG Mapping Cancer Markers
WCG Smash Childhood Cancer
WCG Microbiome Immunity Project
WCG OpenPandemics – COVID-19
WCG Africa Rainfall Project

World Community Grid is a philanthropic initiative of IBM Corporate Citizenship

Another great place to start is BOINC – the Berkeley Open Infrastructure for Network Computing.

There are many projects in Basic and Applied Science available at the BOINC website. Visit the site, download the software to your device, use BOINC directives to configure your computer for optimal usage and attach to those projects which you would like to help.


See the full article here.


Please help promote STEM in your local schools.

Stem Education Coalition


North Carolina State University (US) was founded with a purpose: to create economic, societal and intellectual prosperity for the people of North Carolina and the country. We began as a land-grant institution teaching the agricultural and mechanical arts. Today, we’re a pre-eminent research enterprise that excels in science, technology, engineering, math, design, the humanities and social sciences, textiles and veterinary medicine.

North Carolina State University (US) students, faculty and staff take problems in hand and work with industry, government and nonprofit partners to solve them. Our 34,000-plus high-performing students apply what they learn in the real world by conducting research, working in internships and co-ops, and performing acts of world-changing service. That experiential education ensures they leave here ready to lead the workforce, confident in the knowledge that NC State consistently rates as one of the best values in higher education.

From University of Zürich (Universität Zürich) (CH): “Research Fascinates Non-Academics Too”

From University of Zürich (Universität Zürich) (CH)

22 Jun 2021

Citizen Science

Around half of the Swiss population is interested in actively taking part in academic research. Social and environmental topics are among the most popular.

There is great interest in citizen science: around half of the Swiss population can imagine participating in participatory research. Image: Anna Yang, Fachhochschule Nordwestschweiz.

Whether it’s collecting hydrological data or measuring biodiversity, there are a host of UZH projects where people with a thirst for knowledge can get involved – even if they don’t have an academic degree or university background. The Citizen Science Center Zürich, run jointly by UZH and ETH Zürich, has been developing and promoting citizen science projects together with the Participatory Science Academy since 2017.

First-ever Swiss-wide data

Citizen science is based on the idea that anyone can participate in research. But who might be willing to spend their time on participatory research projects, and under what conditions? Until now, there was no well-founded data available on the overall willingness of Swiss people to get involved in such research. “We wanted to fill this gap with a representative study to be able to identify our target groups even better,” explains Susanne Tönsmann, managing director of the Participatory Science Academy.

The study was developed by the Participatory Science Academy, the UZH Department of Communication and Media Research and the FHNW School of Social Work. A total of 1,394 people over the age of 18 in Switzerland were surveyed for the study.

The key findings include:

Awareness:

8% of respondents are familiar with the term “citizen science”, and 15% know the term “participatory research”.

Participation:

5% of respondents have taken part in a citizen science project before.
48% of respondents could imagine taking part in participatory research. A majority (83%) would be prepared to invest at least a few hours a month for this. People who are particularly interested in citizen science mainly include young people, people with higher levels of education, and people who are open to scientific topics.
When it comes to the reasons for not participating, 40% stated that they lacked the required knowledge, 29% said they didn’t have time, and 27% said they weren’t interested.

Research tasks:

“Collecting and classifying data” is a popular task (50%), followed by “interpreting results” (43%) and “co-determining a research question” (33%).

Topics:

Social topics are the most popular (55%), followed by environment/animals (49%), technology/natural sciences (48%), medicine/health (44%) and art/culture (21%).
______________________________________________________________________________________________________________

Major Avenues to Citizen Science

World Community Grid


From World Community Grid (WCG)

“World Community Grid (WCG) brings people together from across the globe to create the largest non-profit computing grid benefiting humanity. It does this by pooling surplus computer processing power. We believe that innovation combined with visionary scientific research and large-scale volunteerism can help make the planet smarter. Our success depends on like-minded individuals – like you.”

BOINC is a leader in the fields of Distributed Computing, Grid Computing and Citizen Cyberscience. BOINC is more properly the Berkeley Open Infrastructure for Network Computing.

CAN ONE PERSON MAKE A DIFFERENCE? YOU BET!!

Please visit the project pages:

Microbiome Immunity Project
FightAIDS@home Phase II
Help Stop TB
Outsmart Ebola Together
Mapping Cancer Markers
Uncovering Genome Mysteries
Say No to Schistosoma
GO Fight Against Malaria
Drug Search for Leishmaniasis
Computing for Clean Water
The Clean Energy Project
Discovering Dengue Drugs – Together
Help Cure Muscular Dystrophy
Help Fight Childhood Cancer
Help Conquer Cancer
Human Proteome Folding
FightAIDS@Home

World Community Grid is a social initiative of IBM Corporation.

Visit the BOINC web page, click on Choose projects and check out some of the very worthwhile studies you will find. Then click on Download and run BOINC software/All Versions. Download and install the current software for your 32-bit or 64-bit system, for Windows, Mac or Linux. When you install BOINC, it will install its screen savers on your system as a default. You can choose to run the various project screen savers or you can turn them off. Once BOINC is installed, in BOINC Manager/Tools, click on “Add project or account manager” to attach to projects. Many BOINC projects are listed there, but not all, and maybe not the one(s) in which you are interested. You can get the proper URL for attaching to a project at that project’s web page. BOINC will never interfere with any other work on your computer.
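
For readers comfortable scripting, the attach step can also be automated. The sketch below drives boinccmd, the console tool that ships with the BOINC client, from Python; the project URL and account key are placeholders to be replaced with values from your chosen project’s page, and boinccmd --help on your installation remains the authoritative reference for its flags.

```python
# Sketch: attaching to a BOINC project by shelling out to boinccmd,
# the console tool bundled with the BOINC client. The URL and key
# below are placeholders, not real values.
import subprocess

PROJECT_URL = "https://boinc.example.org/"  # hypothetical project URL
ACCOUNT_KEY = "YOUR_ACCOUNT_KEY_HERE"       # shown on the project's web page

def boinccmd(*args):
    """Run boinccmd with the given arguments and return its output."""
    result = subprocess.run(
        ["boinccmd", *args], capture_output=True, text=True, check=True
    )
    return result.stdout

# Attach this machine to a project (the GUI's "Add project" equivalent)
boinccmd("--project_attach", PROJECT_URL, ACCOUNT_KEY)

# List the tasks the client is currently crunching
print(boinccmd("--get_tasks"))
```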

MAJOR PROJECTS RUNNING ON BOINC SOFTWARE

SETI@home The search for extraterrestrial intelligence. “SETI (Search for Extraterrestrial Intelligence) is a scientific area whose goal is to detect intelligent life outside Earth. One approach, known as radio SETI, uses radio telescopes to listen for narrow-bandwidth radio signals from space. Such signals are not known to occur naturally, so a detection would provide evidence of extraterrestrial technology.

Radio telescope signals consist primarily of noise (from celestial sources and the receiver’s electronics) and man-made signals such as TV stations, radar, and satellites. Modern radio SETI projects analyze the data digitally. More computing power enables searches to cover greater frequency ranges with more sensitivity. Radio SETI, therefore, has an insatiable appetite for computing power.

Previous radio SETI projects have used special-purpose supercomputers, located at the telescope, to do the bulk of the data analysis. In 1995, David Gedye proposed doing radio SETI using a virtual supercomputer composed of large numbers of Internet-connected computers, and he organized the SETI@home project to explore this idea. SETI@home was originally launched in May 1999.”

SETI@home is the birthplace of BOINC software. Originally, it only ran in a screensaver when the computer on which it was installed was doing no other work. With the power and memory available today, BOINC can run 24/7 without in any way interfering with other ongoing work.

SETI@home, a BOINC [Berkeley Open Infrastructure for Network Computing] project that originated in the Space Science Lab at UC Berkeley.

The famous SETI@home screen saver, a beauteous thing to behold.

einstein@home The search for pulsars. “Einstein@Home uses your computer’s idle time to search for weak astrophysical signals from spinning neutron stars (also called pulsars) using data from the LIGO gravitational-wave detectors, the Arecibo radio telescope, and the Fermi gamma-ray satellite. Einstein@Home volunteers have already discovered more than a dozen new neutron stars, and we hope to find many more in the future. Our long-term goal is to make the first direct detections of gravitational-wave emission from spinning neutron stars. Gravitational waves were predicted by Albert Einstein almost a century ago, but have never been directly detected. Such observations would open up a new window on the universe, and usher in a new era in astronomy.”

MilkyWay@Home “Milkyway@Home uses the BOINC platform to harness volunteered computing resources, creating a highly accurate three-dimensional model of the Milky Way galaxy using data gathered by the Sloan Digital Sky Survey. This project enables research in both astroinformatics and computer science.”

Leiden Classical “Join in and help to build a Desktop Computer Grid dedicated to general Classical Dynamics for any scientist or science student!”

World Community Grid (WCG) World Community Grid is a special case at BOINC. WCG is part of the social initiative of IBM Corporation and the Smarter Planet. WCG currently has under its umbrella eleven disparate projects at wide-ranging institutions and universities around the globe. Most projects relate to biological and medical subject matter. There are also projects for Clean Water and Clean Renewable Energy. WCG projects are each treated in their own right on this blog. Watch for news.


Rosetta@home “Rosetta@home needs your help to determine the 3-dimensional shapes of proteins in research that may ultimately lead to finding cures for some major human diseases. By running the Rosetta program on your computer while you don’t need it you will help us speed up and extend our research in ways we couldn’t possibly attempt without your help. You will also be helping our efforts at designing new proteins to fight diseases such as HIV, Malaria, Cancer, and Alzheimer’s….

GPUGrid.net “GPUGRID.net is a distributed computing infrastructure devoted to biomedical research. Thanks to the contribution of volunteers, GPUGRID scientists can perform molecular simulations to understand the function of proteins in health and disease.” GPUGrid is a special case in that all processor work done by the volunteers is GPU processing. There is no CPU processing, which is the more common processing. Other projects (Einstein, SETI, Milky Way) also feature GPU processing, but they offer CPU processing for those not able to do work on GPUs.


These projects are just the oldest and most prominent projects. There are many others from which you can choose.

There are currently some 300,000 users with about 480,000 computers working on BOINC projects. That is in a world of over one billion computers. We sure could use your help.


______________________________________________________________________________________________________________
See the full article here.

Please help promote STEM in your local schools.


Stem Education Coalition

The University of Zürich (Universität Zürich) (CH), located in the city of Zürich, is the largest university in Switzerland, with over 26,000 students. It was founded in 1833 from the existing colleges of theology, law, medicine and a new faculty of philosophy.

Currently, the university has seven faculties: Philosophy, Human Medicine, Economic Sciences, Law, Mathematics and Natural Sciences, Theology and Veterinary Medicine. The university offers the widest range of subjects and courses of any Swiss higher education institution.

Since 1833

As a member of the League of European Research Universities (LERU) and the Universitas 21 (U21) network, the University of Zürich belongs among Europe’s most prestigious research institutions. In 2017, the University of Zürich joined the Universitas 21 (U21) network, a global network of 27 research universities that promotes research collaboration and the exchange of knowledge.

Numerous distinctions highlight the University’s international renown in the fields of medicine, immunology, genetics, neuroscience and structural biology as well as in economics. To date, the Nobel Prize has been conferred on twelve UZH scholars.

Sharing Knowledge

The academic excellence of the University of Zürich brings benefits to both the public and the private sectors not only in the Canton of Zürich, but throughout Switzerland. Knowledge is shared in a variety of ways: in addition to granting the general public access to its twelve museums and many of its libraries, the University makes findings from cutting-edge research available to the public in accessible and engaging lecture series and panel discussions.

1. Identity of the University of Zürich

Scholarship

The University of Zürich (UZH) is an institution with a strong commitment to the free and open pursuit of scholarship.

Scholarship is the acquisition, the advancement and the dissemination of knowledge in a methodological and critical manner.

Academic freedom and responsibility

To flourish, scholarship must be free from external influences, constraints and ideological pressures. The University of Zürich is committed to unrestricted freedom in research and teaching.

Academic freedom calls for a high degree of responsibility, including reflection on the ethical implications of research activities for humans, animals and the environment.

Universitas

Work in all disciplines at the University is based on a scholarly inquiry into the realities of our world.

As Switzerland’s largest university, the University of Zürich promotes wide diversity in both scholarship and in the fields of study offered. The University fosters free dialogue, respects the individual characteristics of the disciplines, and advances interdisciplinary work.

2. The University of Zurich’s goals and responsibilities

Basic principles

UZH pursues scholarly research and teaching, and provides services for the benefit of the public.

UZH has successfully positioned itself among the world’s foremost universities. The University attracts the best researchers and students, and promotes junior scholars at all levels of their academic career.

UZH sets priorities in research and teaching by considering academic requirements and the needs of society. These priorities presuppose basic research and interdisciplinary methods.

UZH strives to uphold the highest quality in all its activities.
To secure and improve quality, the University regularly monitors and evaluates its performance.

Research

UZH contributes to the increase of knowledge through the pursuit of cutting-edge research.

UZH is primarily a research institution. As such, it enables and expects its members to conduct research, and supports them in doing so.

While basic research is the core focus at UZH, the University also pursues applied research.

From Simons Foundation: “A Software, a Community and a Different Way to Do Science”

From Simons Foundation
July 30, 2020
Susan Reslewic Keatley, Ph.D.

Rendering of a composite structure depicting the interaction between the ACE2/BOAT1 complex (blue/yellow) from PDB ID 6M17, and the SARS-CoV-2 spike ectodomain structure (pink, teal) from PDB ID 6VYB. Credit: P. Douglas Renfrew/Flatiron Institute.

The suite of software tools collectively known as Rosetta is defined not only by what it does, but also by a community of scientists who are changing how collaborations thrive and move science forward.

Main laboratories and institutions in the RosettaCommons and basic facts about the software. Figure first appeared in PLOS Computational Biology.

Rising up against this computational tower of Babel is Rosetta, a suite of software tools for macromolecular modeling and design. Like its namesake, the Rosetta stone, which gave the modern world a key to deciphering ancient hieroglyphs, Rosetta was first intended as a key for deciphering proteins, the building blocks of life. Designed originally to predict individual protein structure, the software has broadened in scope: It can now help scientists map complex interactions between proteins and design novel proteins. It can also boost a whole host of other biological applications in fields from medicine to synthetic materials to climate science. With 500 developers at over 70 academic institutions worldwide, Rosetta is defined not only by what it does, but also by a community of scientists who are changing how science is done and how collaborations thrive and move science forward.

“Rosetta was born in the wild, the raw and the unstructured,” recalls Richard Bonneau, a group leader for systems biology at the Flatiron Institute’s Center for Computational Biology. As a student in David Baker’s biochemistry lab at the University of Washington in the mid-’90s, he and several members of the lab sat down to write a code that would predict protein structure — solving a problem that had long eluded researchers. With 3.1 million lines of code and over 35,000 licenses, the Rosetta of 2020 looks very different from the one Bonneau helped craft 25 years ago. What remains the same, however, is the intent to build a standardized, shareable code that anyone can use, and to grow a cohesive community to further evolve and strengthen the code base.


“David Baker had a view early on that this community would meet regularly and that the code would be centralized,” says Roland Dunbrack, a Rosetta principal investigator and a professor in the Molecular Therapeutics Program at the Fox Chase Cancer Center.


Our knowledge of biology has transformed over the last few decades, but the fundamental relationship between a molecule’s structure and function is still a guiding principle of discovery-driven biological research. Rosetta assesses the structure of proteins and other biological molecules — whether natural or designed — by considering all aspects of a molecule’s conformation, from how the individual atoms attract or repel each other to how segments of a molecule can move freely in space. It then selects the structure with the lowest free energy. This information is critical for scientists working to decipher protein structure and function. Recently, improved structure prediction and a burst of new applications have ballooned Rosetta’s offerings to include over 80 distinct methods for macromolecular modeling, as reported on June 1 in Nature Methods — a milestone that represents a boon to the scientific world.
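
The “select the structure with the lowest free energy” strategy is easiest to see in toy form. The sketch below is not Rosetta’s score function or sampler; it uses an invented energy over a vector of angles and a plain Metropolis Monte Carlo search, purely to illustrate the sample, score, keep-the-best loop that structure prediction programs run at vastly larger scale.

```python
# Toy Metropolis Monte Carlo search, illustrating the idea of sampling
# conformations and keeping the lowest-energy one. The energy function
# is an invented stand-in, NOT Rosetta's physically derived score.
import math
import random

def toy_energy(angles):
    """Hypothetical energy with a minimum near an arbitrary 'native' angle."""
    return sum((a - 1.0) ** 2 + 0.5 * math.cos(3 * a) for a in angles)

def metropolis_search(n_angles=10, steps=5000, temperature=1.0):
    angles = [random.uniform(-math.pi, math.pi) for _ in range(n_angles)]
    energy = toy_energy(angles)
    best_angles, best_energy = list(angles), energy
    for _ in range(steps):
        trial = list(angles)
        trial[random.randrange(n_angles)] += random.gauss(0, 0.3)  # small move
        e_trial = toy_energy(trial)
        # Metropolis criterion: accept downhill moves, occasionally uphill ones
        if e_trial < energy or random.random() < math.exp((energy - e_trial) / temperature):
            angles, energy = trial, e_trial
            if energy < best_energy:
                best_angles, best_energy = list(angles), energy
    return best_angles, best_energy

_, best = metropolis_search()
print(f"lowest toy energy found: {best:.3f}")
```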

Communicating all of Rosetta’s capabilities is one of the many challenges of managing a colossal software suite and a community of thousands of users. The recent Nature Methods paper is an important step toward that goal, however, serving as a catalog for the community of Rosetta users and the larger scientific community, says Julia Koehler Leman, one of the paper’s first authors and a research scientist in systems biology in Bonneau’s group at the Flatiron Institute. With over 100 authors, the paper reviews Rosetta’s advances over the last five years, with an emphasis on major scientific applications, user interfaces and usability.

The Nature Methods resource also highlights Rosetta’s approach to several unique challenges to modeling and understanding in biology. Take membrane proteins, which are targets for 60% of the pharmaceuticals on the market despite making up just 30% of all human proteins. Because they are hard to work with in the lab, they make up a tiny fraction of the proteins available in structure databases, which Rosetta uses for its prediction algorithms. An additional obstacle is that Rosetta was developed for proteins in water rather than those embedded within cell membranes, which are ‘greasy’ and water insoluble. As a postdoctoral fellow who had also worked in an experimental membrane proteins lab during graduate school, Koehler Leman worked with colleagues to adapt Rosetta to the membrane environment. “The training I had experimentally with membrane proteins shaped how I develop code,” Koehler Leman says, and led her to emphasize ease of the user interface in her coding. Rosetta now offers an array of capabilities for modeling the characteristics of membrane proteins, including protein-protein docking and design.

Antibodies, the proteins of the immune system, are another challenge for Rosetta. Unlike other proteins, they contain loop regions that can confound structure prediction. They are also known to make split-second changes when binding to an antigen, making them difficult to predict and model. A large collaboration of researchers, including Jeffrey Gray, a Rosetta principal investigator and a professor of chemical and biomolecular engineering at Johns Hopkins University, has succeeded in creating Rosetta methods to predict the structure of an antibody from its sequence, and then model the interaction of the antibody with its antigen. Understanding these interactions is critical for developing therapeutic antibodies or vaccines. Motivated by COVID-19, Gray, Dunbrack and other Rosetta developers are thinking about how to most effectively design antibodies to combat this and future pandemics. “Our collaborations through Rosetta have given us deep internal knowledge of antibodies,” says Gray. “The synergistic and positive nature of this community has helped us accelerate science.”

eXtreme Rosetta Workshops (XRWs) are organized annually and have had a drastic positive impact on both the software and the community. Image first appeared in PLOS Computational Biology.

Rosetta has expanded beyond proteins, to RNA and DNA. RNA structure in particular presents challenges distinct from those of proteins. Loops with irregular nucleotide pairing abound, and the method Rosetta uses for proteins flounders in the presence of RNA; multiple possible energy minima can confound the overall energetic view of a conformation, much the way deep potholes on a hill might mislead an altimeter. Rosetta developers have demonstrated RNA structure prediction, as well as RNA- and DNA-protein binding, by modeling the molecules in a step-by-step fashion, in essence sacrificing computational expense for accuracy. Several of the leading COVID-19 vaccine candidates, including two of those selected for Operation Warp Speed, initiated by the federal government to accelerate vaccine development against COVID-19, are DNA- or RNA-based. This underscores the importance of making tools available to probe nucleic acids and how they bind to proteins.

Rosetta’s modular nature is its secret weapon: Scientists can build a dizzying array of workflows from the thousands of available code classes. “There are things we can do with Rosetta that we can’t do otherwise, like design proteins so stable they are more like nonliving materials and integrate high-throughput computation with high-throughput experiments,” says Bonneau. The sheer size of both the software itself and the worldwide community can, at times, feel unwieldy, added Bonneau, but ultimately it is necessary for solving big scientific problems.

Rosetta’s licensing agreement is unique in that most of the fees paid by pharmaceutical companies flow back to the RosettaCommons, the community of developers, to support code maintenance and community building. “You can think of Rosetta as a multi-institution research group, with money,” says Dunbrack. “There are lots of consortiums out there, but not as many with their own source of funds.” Recently, the corporate licensing agreement was changed so companies can contribute code back to Rosetta. “This change says a lot about where our tools are, and how the community and the science are evolving,” says Brian Weitzner, another of the Nature Methods paper’s first authors and a senior scientist at Lyell Immunopharma, a company Baker co-founded.

Maintaining the code takes great effort and coordination. Each time a developer submits a piece of code, it has to be integrated into the entire Rosetta suite. “Individual code development branches are merged into the software several times a day,” says Koehler Leman, “so we need to continually test the software to make sure it won’t break.” The benefits make this effort worthwhile, says Bonneau. “For whatever you want to do, whether with DNA, RNA, drugs or surfaces, you might just have two to 10 people in the community writing a code in the same framework.” RosettaCommons issues its own grants to members of the community for code maintenance, something scientific research grants won’t often cover.

The emphasis on documentation and interface development aims to make Rosetta more user-friendly and a benchmark for how people can develop powerful software in any community, says Koehler Leman. Detailed user instructions, called protocol captures, accompany each new addition of code, and three different language interfaces (C++, Python and command line) are available to developers. For the general public, including K-12 students, the video game Foldit offers a chance to play with protein structure, with terms like ‘rubber bands’ for restraints and ‘shake’ for rotating parts of a molecule. Foldit’s 700,000 regular users routinely solve real-world scientific structure puzzles, including a challenge this past February to design a protein to inhibit the spike protein on the new coronavirus, with the top results selected for experimental testing in labs.
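
As a taste of the Python interface, the fragment below follows the calls commonly shown in PyRosetta tutorials: building a pose from a sequence and scoring it with the default all-atom energy function. It assumes a licensed PyRosetta installation, and the PyRosetta documentation remains the authoritative reference for this API.

```python
# Minimal PyRosetta sketch (assumes PyRosetta is installed and licensed).
# Calls follow the commonly documented top-level API; see the PyRosetta
# docs for authoritative usage.
from pyrosetta import init, pose_from_sequence, get_fa_scorefxn

init()  # start the Rosetta runtime

# Build a small poly-alanine pose directly from a one-letter sequence
pose = pose_from_sequence("AAAAAAAA")

# Score it with the default all-atom energy function (lower is better)
scorefxn = get_fa_scorefxn()
print("total score:", scorefxn(pose))
```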

To rally the community around the arduous tasks of standardized documentation and code curation, RosettaCommons holds a meeting (RosettaCon) each summer and winter, hack-a-thons for code maintenance and improvement, and boot camps to train junior developers. It also grants an annual Rosetta Service Award for contributions to code maintenance or community leadership. A conversation in 2012 between Weitzner, Andrew Leaver-Fay, now an assistant professor of biochemistry and biophysics at the University of North Carolina School of Medicine, and Matthew O’Meara, now a research assistant professor of computational medicine and bioinformatics at the University of Michigan Medical School, led to the creation of the boot camp. “We noticed postdocs spent a year and a half learning to program, and we said, let’s have a class. I’ve learned that when you ask, ‘What if we did this differently?’ the community is so supportive and the response is, ‘Yeah, let’s do it,’” says Weitzner, who worked in Dunbrack’s lab in high school and college, in Gray’s lab in graduate school, and in Baker’s lab as a postdoc.

In an internship program, college students can spend a summer in a Rosetta lab, sandwiched between a week at the Coding Boot Camp at the University of North Carolina and a week at the summer RosettaCon in Washington state. “They start to foster this community right when you come in,” says former intern and current Coding Boot Camp teaching assistant Anna Yaschenko, who graduated from the University of Maryland this year with a dual major in computer science and bioinformatics. “RosettaCon is so casual — it allows people to connect in ways you couldn’t at a typical conference. I was surprised at how tight-knit the community was despite being so large.”

A post-baccalaureate program starts this summer, Gray says, and all five participants are from groups underrepresented in STEM fields. Rosetta’s diversity, equity and inclusion committee has encouraged Rosetta principal investigators, students and postdocs to attend conferences like the Annual Biomedical Research Conference for Minority Students; oSTEM, a professional society for LGBTQ people in STEM fields; and the Grace Hopper Celebration of Women in Computing conference. “Diversity in research programs is important because it’s fair, and everyone should have the opportunity to participate,” says Dunbrack. RosettaCommons recently put out a statement on Black Lives Matter that included action items for individuals and labs to combat racism. “We had so many people weighing in,” says Gray, “saying ‘this is important,’ or ‘here’s this subtlety.’ There’s still a lot of work to do, but I was very proud of our community for their serious engagement.”

As a software and a community, Rosetta represents a different way to do science. “We really believe the best idea wins, no matter where it comes from,” says Koehler Leman. “At our conferences, people are less worried about being right or wrong, and more concerned with ‘Does something work or not?’”

“Other research communities can benefit from this approach,” says Weitzner, “and collaborate more and not worry as much about competing.” The challenge will be to continue to balance innovation with standardization as the software and community grow. “We’ve got to maintain the quality and continuity of the code, while integrating new methods and research into Rosetta,” says Bonneau. “New problems in biology have a scale and complexity that demand this kind of collaboration.”

[I participated in rosetta@home as a BOINC cruncher for a number of years.]


See the full article here.


Please help promote STEM in your local schools.

Stem Education Coalition


Mission and Model

The Simons Foundation’s mission is to advance the frontiers of research in mathematics and the basic sciences.

Co-founded in New York City by Jim and Marilyn Simons, the foundation exists to support basic — or discovery-driven — scientific research undertaken in the pursuit of understanding the phenomena of our world.

The Simons Foundation’s support of science takes two forms: We support research by making grants to individual investigators and their projects through academic institutions, and, with the launch of the Flatiron Institute in 2016, we now conduct scientific research in-house, supporting teams of top computational scientists.

From The New York Times: “The Search for E.T. Goes on Hold, for Now”

From The New York Times

March 23, 2020
Dennis Overbye

A popular screen saver takes a break while its inventors try to digest data that may yet be hiding news of extraterrestrials.

SETI@home, a BOINC project originated in the Space Science Lab at UC Berkeley

The seti@home screensaver, launched in May 1999, crunched data while your computer was idle.
One of the great science fiction fantasies of all time — that you might discover aliens texting you from outer space on your computer — is about to take a breather.

For the last 21 years ordinary people, armchair astronomers, citizen scientists, sitting at home or in their offices, were able to participate in the search for extraterrestrial intelligence — SETI — thanks to a screen saver called seti@home. Once installed, the program would periodically download data from the University of California, Berkeley, process it while the computer was idle, and then send it back.

[Some of this is currently inaccurate. While seti@home began its life as a screen saver, it became a full-fledged project running on BOINC software, under CPU control similar to other projects like Rosetta@home and World Community Grid. BOINC software has become the basis for all sorts of distributed computing science projects. The most recent statistic on the size of the BOINC world is a 24-hour average of 28.495 PetaFLOPS, which today would place BOINC at No. 5 in the TOP500 if distributed computing were included in the mix, which it is not.


David Baker’s Rosetta@home project, a project running on BOINC software from UC Berkeley


I was a BOINC cruncher for about 6 years. Below is my record of achievement. As you can see, I was in the 99th percentile of all BOINC users for all time.]

My BOINC

On March 2, the ringleaders of the seti@home effort, a beleaguered and somewhat diminished band of Berkeley astronomers, announced on their website that they were taking a break. On March 31 the program will stop sending out data and go into “hibernation.” The team, they explained, needs time to digest its decades of findings.

The suspension of new data mining removes yet another pleasant diversion that some of us — there were about 100,000 seti@home members at last count — could pursue during our social distancing prompted by the coronavirus pandemic.

Launched in May 1999, the program was one of the first great innovations of a then young internet, one of the first and most popular efforts to crowdsource difficult computations. It allowed you to imagine that you might one day receive a spam call or email from a real-estate agency on some asteroid, or a little green salesman trying to sell you black hole insurance.

I was an early and enthusiastic adopter of seti@home. I spent many a slack moment — that is to say, most of my moments — staring at the shifting mountain range of graphics that appeared on my office screen, constantly rearranging themselves in mysterious ways. I wondered what, if anything, they were saying — if someday the news that we are not alone would have my computer to thank.

Participating gave me the same feeling as being at NASA’s Jet Propulsion Laboratory during Voyager’s planetary encounters.

NASA JPL

In those wonderful days, images beamed back from the spacecraft of moons, rings and other baffling phenomena in the outer solar system appeared on screens in the reporters’ newsroom at the same time that scientists, huddled in their offices, saw them for the first time.

We were united in our ignorance and our curiosity, wondering what the universe held in store for us that day.

We still don’t know. But the search for extraterrestrial intelligence has become a much more hopeful endeavor since 1960 when Frank Drake, now a retired professor at the University of California, Santa Cruz, pointed a radio telescope at two nearby stars in the hope of catching an interstellar broadcast.

Frank Drake with his Drake Equation. Credit Frank Drake

Drake Equation, Frank Drake, Seti Institute

He thought he heard something, and then he didn’t, which has been the story of the search ever since: thousands of stars, millions of radio frequencies, cosmic silence, the Great Silence.

Billions of stars, trillions of frequencies to go.

Green Bank Radio Telescope, West Virginia, USA, now the centerpiece of the GBO, Green Bank Observatory, being cut loose by the NSF


NAIC Arecibo Observatory operated by University of Central Florida, Yang Enterprises and UMET, Altitude 497 m (1,631 ft).

The National Radio Astronomy Observatory’s Robert C. Byrd Green Bank Telescope in rural Pocahontas County, W.Va., one of two telescopes — the other the Arecibo radio telescope in Puerto Rico — whose observational data Seti@Home processed. Credit: Jim West/Alamy

The logic of this endeavor is as unassailable as its prospects are rickety. Sentient beings anywhere in the galaxy, having reached a certain level of technological sophistication, would realize that the distances between stars are physically unbridgeable and would likely choose to communicate with radio waves.

But joining the cosmic conversation, if there is such a thing, would require us humans on the listening end to know which of 100 billion stars to point our receivers at, and which frequency to tune in to. That’s an optimistic scenario. And, of course, we would have to be able to figure out what they are saying once we heard it.

We now know that there are billions of other planets in the Milky Way galaxy alone. Thanks to efforts like NASA’s TESS satellite, we are beginning to discern some details of the closest ones. We know that they can look at us just as we are looking at them.

These days, one of the most extensive searches is being made by Breakthrough Listen, a program underwritten by the billionaire Yuri Milner and his friends.

Breakthrough Listen Project


UC Observatories Lick Automated Planet Finder, fully robotic 2.4-meter optical telescope at Lick Observatory, situated on the summit of Mount Hamilton, east of San Jose, California, USA



GBO radio telescope, Green Bank, West Virginia, USA

CSIRO/Parkes Observatory, located 20 kilometres north of the town of Parkes, New South Wales, Australia

SKA Meerkat telescope, 90 km outside the small Northern Cape town of Carnarvon, SA

Newly added

CfA/VERITAS, a major ground-based gamma-ray observatory with an array of four Čerenkov Telescopes for gamma-ray astronomy in the GeV – TeV energy range. Located at the Fred Lawrence Whipple Observatory, Mount Hopkins, Arizona, USA, Altitude 2,606 m (8,550 ft)

The effort, headquartered at the University of California, Berkeley, uses the giant 100-meter-diameter radio dish at Green Bank, West Virginia, among others. Seti@home has been piggybacking on those telescopes, looking at whatever they are looking at.

Once upon a time, almost 2 million computers were subscribed to the program, but it has since declined twentyfold. As the seti@home team explained in a recent conference call, they have been able to gauge the average lifetime of personal computers by how long they remain registered on the website — about three years.

All this has not happened without a few ruffled feathers. Legend has it that some I.T. administrators have found their networks bogged down by too many people running the screen saver at once. Dan Werthimer, who holds the Watson and Marilyn Alberts SETI Chair at the University of California, Berkeley, said this was overblown. Once, he said, a school administrator got in trouble after downloading seti@home to all the computers in the school. But after 21 years, the team doesn’t yet know whether their screen saver recorded any alien signals.

“Our resources have been limited,” Eric Korpela, the current director of the seti@home program, said.

A couple of years into the program, the team went to the Arecibo radio telescope with a list of promising signals worth checking out, to no avail. Now there are 20 billion events — it would be presumptuous to call them “signals” — awaiting another look.

In the meantime the team, never large, has shrunk to Dr. Werthimer, Dr. Korpela, David Anderson, the project’s founding director, and Jeff Cobb, who developed much of its software. In the recent phone call, they said they had been too busy keeping the computer servers running over the years to actually analyze all that data, and it is weighing on their minds. If they don’t take a break and do it now, they never will.

“We’re getting older,” Dr. Korpela said. “Some of us are retiring.”

“We haven’t published,” Dr. Werthimer said. “Our colleagues let us know about it every time we see them at scientific conferences.” He added, “We’ll keep working on results. One of them may be from E.T. We don’t know.” Noting that most of the sky had been seen many times, he said, “You might not be the only one who saw it.”

Dr. Drake once speculated that SETI was most likely to tap into cosmic religious radio broadcasts of the sort that predominate if you happen to be driving cross country. Personally, I’m steeling myself for a birdlike voice warning me that the warranty on my antimatter drive is about to expire.

Yearning for companionship is eternal. Even if it comes with cosmic spam.

See the full article here.


Please help promote STEM in your local schools.

Stem Education Coalition

From SETI@home: Winter 2019 News Letter

SETI@home
From SETI@home


I’ve worked at Berkeley’s SETI Research Center for 25 years and co-founded SETI@home.


Thank you for your efforts as a member of the SETI@home team in 2019. Our program of searching for intelligent extraterrestrial life continues to expand, but SETI@home still needs your help.

We are putting the finishing touches on our Nebula software suite which will analyze all results from both SETI@home and SERENDIP VI. We are focusing our first efforts on a complete analysis of all SETI@home Arecibo data to date.


NAIC Arecibo Observatory operated by University of Central Florida, Yang Enterprises and UMET, Altitude 497 m (1,631 ft).

As you can imagine, it is difficult to quantify the quality and sensitivity of our analysis given that there are no known ETIs to use as a reference! So part of the design of Nebula is to generate a large number of synthetic ETI-like signals, called birdies. Our set of birdies ranges from those that model stationary transmitters on far-off planets to transmitters orbiting around a variety of planet types. We are also looking into using machine learning for anomaly detection.
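[To make the birdie idea concrete, here is a minimal sketch in Python. It is illustrative only, not the Nebula code: the sample rate, drift rate, amplitude, and detection step are all invented, with a drifting tone standing in for an “orbiting transmitter” birdie.]

import numpy as np

# Illustrative birdie injection -- not the actual Nebula code. A birdie is a
# synthetic ETI-like signal mixed into noise so a pipeline's sensitivity can
# be measured against known inputs. A transmitter orbiting a planet shows up
# as a tone whose frequency drifts; a stationary one has near-zero drift.

rng = np.random.default_rng(42)

def make_birdie(n, fs, f0, drift, amp):
    # Complex tone at f0 Hz with a linear frequency drift (Hz/s).
    t = np.arange(n) / fs
    return amp * np.exp(2j * np.pi * (f0 * t + 0.5 * drift * t**2))

def observation(n, fs, birdie=None):
    # Unit-variance complex Gaussian noise, with an optional birdie added.
    noise = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)
    return noise if birdie is None else noise + birdie

# Inject a weak, slowly drifting birdie and confirm a simple FFT search finds it.
fs, n = 1024.0, 8192    # toy sample rate (Hz) and observation length
data = observation(n, fs, make_birdie(n, fs, f0=100.0, drift=0.01, amp=0.2))
power = np.abs(np.fft.fft(data)) ** 2
print("strongest channel: %.2f Hz" % np.fft.fftfreq(n, 1 / fs)[np.argmax(power)])

Repeating this over many birdies with randomized drifts and strengths yields a recovery rate, which is one way to quantify a search’s sensitivity when no real ETI signal exists to serve as a reference.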

Two major papers will come out of this analysis. One will be on SETI@home as an instrument and the other will present the analysis in detail.

This year saw the further commissioning and improvement of SERENDIP VI / FASTBurst, deployed on the FAST radio telescope in China – now the largest on the planet.

FAST [Five-hundred-meter Aperture Spherical Telescope] radio telescope, with phased arrays from CSIRO engineers Australia [located in the Dawodang depression in Pingtang County, Guizhou Province, south China]

Our instrument is dual purpose, looking for both ETI and Fast Radio Bursts (FRBs).

FRB Fast Radio Bursts from NAOJ Subaru, Mauna Kea, Hawaii, USA

FRBs are transient radio pulses of short duration caused by some as yet unknown astrophysical process. During one exciting testing session we detected FRB 121102, a rare repeating FRB. The detection demonstrates the sensitivity of our instrument, as this faint signal is detectable by very few telescopes/instruments.

We continue to obtain raw data from Berkeley’s Breakthrough Listen program.

Breakthrough Listen Project


UC Observatories Lick Automated Planet Finder, fully robotic 2.4-meter optical telescope at Lick Observatory, situated on the summit of Mount Hamilton, east of San Jose, California, USA



GBO radio telescope, Green Bank, West Virginia, USA

CSIRO/Parkes Observatory, located 20 kilometres north of the town of Parkes, New South Wales, Australia

SKA Meerkat telescope, 90 km outside the small Northern Cape town of Carnarvon, SA

Newly added

CfA/VERITAS, a major ground-based gamma-ray observatory with an array of four 12m optical reflectors for gamma-ray astronomy in the GeV – TeV energy range. Located at the Fred Lawrence Whipple Observatory, Mount Hopkins, Arizona, USA, Altitude 2,606 m (8,550 ft)

At Green Bank, observing is about to migrate from looking at stars within our own galaxy to observing other galaxies. Meanwhile, at Parkes, we will be surveying the galactic plane. During this survey the raw “voltage” data from the telescope will be recorded. These data will be ideal for processing by SETI@home volunteers like you.

To accomplish our goals for next year, SETI@home needs two things. First, we need you, and your friends and family. Please spread the word about SETI@home and encourage people to participate. Second, SETI@home needs the funding to obtain the hardware and develop software required to handle new data sources.

How to donate

See the full article here.


Please help promote STEM in your local schools.

Stem Education Coalition

The science of SETI@home
SETI (Search for Extraterrestrial Intelligence) is a scientific area whose goal is to detect intelligent life outside Earth. One approach, known as radio SETI, uses radio telescopes to listen for narrow-bandwidth radio signals from space. Such signals are not known to occur naturally, so a detection would provide evidence of extraterrestrial technology.

Radio telescope signals consist primarily of noise (from celestial sources and the receiver’s electronics) and man-made signals such as TV stations, radar, and satellites. Modern radio SETI projects analyze the data digitally. More computing power enables searches to cover greater frequency ranges with more sensitivity. Radio SETI, therefore, has an insatiable appetite for computing power.
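[A back-of-envelope calculation, under the idealized assumption of a steady, drift-free narrowband tone, shows why more computing buys more sensitivity:]

% An N-point FFT over an observation of length T = N / f_s resolves channels of width
\[
  \Delta f = \frac{1}{T} = \frac{f_s}{N} .
\]
% A steady tone adds coherently, so its power in the matching channel grows as N^2,
% while the noise power per channel grows only as N; the tone's signal-to-noise
% ratio therefore grows linearly with integration time,
\[
  \mathrm{SNR} \propto N \propto T ,
\]
% and the weakest detectable tone shrinks as 1/T. Longer FFTs (roughly N log N
% operations per band) and finer channels across wider bands are pure gain --
% hence the insatiable appetite for computing power.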

Previous radio SETI projects have used special-purpose supercomputers, located at the telescope, to do the bulk of the data analysis. In 1995, David Gedye proposed doing radio SETI using a virtual supercomputer composed of large numbers of Internet-connected computers, and he organized the SETI@home project to explore this idea. SETI@home was originally launched in May 1999.

SETI@home is not a part of the SETI Institute

The SETI@home screensaver image

SETI@home, a BOINC project originated in the Space Science Lab at UC Berkeley

To participate in this project, download and install the BOINC software on which it runs. Then attach to the project. While you are at BOINC, look at some of the other projects which you might find of interest.
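[For readers who prefer to script the setup: the BOINC client ships with a command-line tool, boinccmd, which can attach a machine to a project. Below is a minimal Python wrapper around it; the project URL is SETI@home’s, the account key is a placeholder, and flag names can vary between client versions, so treat this as a starting point rather than gospel.]

import subprocess

# Minimal sketch: attach the local BOINC client to a project and list tasks.
# Assumes the BOINC client is installed and running and that boinccmd is on
# the PATH. The account key is a placeholder -- use your own, from the
# project's account page.

PROJECT_URL = "http://setiathome.berkeley.edu/"
ACCOUNT_KEY = "YOUR_ACCOUNT_KEY_HERE"

def attach(url, key):
    # Ask the running client to attach to the project.
    subprocess.run(["boinccmd", "--project_attach", url, key], check=True)

def show_tasks():
    # Print the work units the client is currently crunching.
    result = subprocess.run(["boinccmd", "--get_tasks"],
                            capture_output=True, text=True, check=True)
    print(result.stdout)

if __name__ == "__main__":
    attach(PROJECT_URL, ACCOUNT_KEY)
    show_tasks()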

My BOINC

From Max Planck Institute for Gravitational Physics: “Pulsating gamma rays from neutron star rotating 707 times a second”

From Max Planck Institute for Gravitational Physics

September 19, 2019

Media contact

Dr. Benjamin Knispel
Press Officer AEI Hannover
Phone: +49 511 762-19104
Fax: +49 511 762-17182
benjamin.knispel@aei.mpg.de

Science contacts
Lars Nieder
Phone: +49 511 762-17491
Fax: +49 511 762-2784
lars.nieder@aei.mpg.de

Prof. Dr. Bruce Allen
Director
Phone: +49 511 762-17148
Fax: +49 511 762-17182
bruce.allen@aei.mpg.de

A black widow pulsar and its small stellar companion, viewed within their orbital plane. Powerful radiation and the pulsar’s “wind” – an outflow of high-energy particles — strongly heat the facing side of the star to temperatures twice as hot as the sun’s surface. The pulsar is gradually evaporating its partner, which fills the system with ionized gas and prevents astronomers from detecting the pulsar’s radio beam most of the time. NASA’s Goddard Space Flight Center/Cruz deWilde

Second fastest spinning radio pulsar known is a gamma-ray pulsar, too. Multi-messenger observations look closely at the system and raise new questions.

An international research team led by the Max Planck Institute for Gravitational Physics (Albert Einstein Institute; AEI) in Hannover has discovered that the radio pulsar J0952-0607 also emits pulsed gamma radiation. J0952-0607 spins 707 times in one second, making it the second-fastest-spinning neutron star known. By analyzing about 8.5 years’ worth of data from NASA’s Fermi Gamma-ray Space Telescope, LOFAR radio observations from the past two years, observations from two large optical telescopes, and gravitational-wave data from the LIGO detectors, the team used a multi-messenger approach to study the binary system of the pulsar and its lightweight companion in detail.

Gran Telescopio Canarias at the Roque de los Muchachos Observatory on the island of La Palma, in the Canaries, Spain, sited on a volcanic peak 2,267 metres (7,438 ft) above sea level
HiPERCAM mounted on the Gran Telescopio Canarias
ESO/NTT at Cerro La Silla, Chile, at an altitude of 2400 metres
ESO La Silla NTT ULTRACAM is an ultra fast camera capable of capturing some of the most rapid astronomical events. It can take up to 500 pictures a second in three different colours simultaneously. It was designed and built by scientists from the Universities of Sheffield and Warwick (United Kingdom), in collaboration with the UK Astronomy Technology Centre in Edinburgh. ULTRACAM employs the latest in charged coupled device (CCD) detector technology in order to take, store and analyse data at the required sensitivities and speeds. CCD detectors can be found in digital cameras and camcorders, but the devices used in ULTRACAM are special because they are larger, faster and most importantly, much more sensitive to light than the detectors used in today’s consumer electronics products. Since it was built, it has operated at the William Herschel Telescope, the New Technology Telescope, and the Very Large Telescope. It is now permanently mounted on the Thai National Telescope.

NASA/Fermi LAT

NASA/Fermi Gamma Ray Space Telescope

ASTRON LOFAR European Map

ASTRON LOFAR Radio Antenna Bank, Netherlands

Their study, published today in The Astrophysical Journal, shows that extreme pulsar systems are hiding in the Fermi catalogues and motivates further searches. Despite being very extensive, the analysis also raises new unanswered questions about this system.

MIT/Caltech Advanced LIGO

Pulsars are the compact remnants of stellar explosions; they have strong magnetic fields and rotate rapidly.

Women in STEM – Dame Susan Jocelyn Bell Burnell

Dame Susan Jocelyn Bell Burnell, discovered pulsars with radio astronomy. Jocelyn Bell at the Mullard Radio Astronomy Observatory, Cambridge University, taken for the Daily Herald newspaper in 1968. Denied the Nobel.
Dame Susan Jocelyn Bell Burnell, pictured at work on the first pulsar chart at the Four Acre Array in 1967. Image courtesy of Mullard Radio Astronomy Observatory.
Dame Susan Jocelyn Bell Burnell 2009
Dame Susan Jocelyn Bell Burnell (1943 – ), still working, from http://famousirishscientists.weebly.com

They emit radiation like a cosmic lighthouse and can be observed as radio pulsars and/or gamma-ray pulsars depending on their orientation towards Earth.

The fastest pulsar outside globular clusters

PSR J0952-0607 (the name denotes the position in the sky) was first discovered in 2017 by radio observations of a source identified by the Fermi Gamma-ray Space Telescope as possibly being a pulsar. No pulsations of the gamma rays in data from the Large Area Telescope (LAT) onboard Fermi had been detected. Observations with the radio telescope array LOFAR identified a pulsating radio source and – together with optical telescope observations – allowed researchers to measure some properties of the pulsar. It orbits the common center of mass every 6.2 hours with a companion star that weighs only a fiftieth of our Sun. The pulsar rotates 707 times in a single second and is therefore the fastest-spinning pulsar in our Galaxy outside the dense stellar environments of globular clusters.

Searching for extremely faint signals

Using this prior information on the binary pulsar system, Lars Nieder, a PhD student at the AEI Hannover, set out to see if the pulsar also emitted pulsed gamma rays. “This search is extremely challenging because the Fermi gamma-ray telescope only registered the equivalent of about 200 gamma rays from the faint pulsar over the 8.5 years of observations. During this time the pulsar itself rotated 220 billion times. In other words, only once in every billion rotations was a gamma ray observed!” explains Nieder. “For each of these gamma rays, the search must identify exactly when during each of the 1.4 millisecond rotations it was emitted.”

This requires combing through the data with very fine resolution in order not to miss any possible signals. The computing power required is enormous. The very sensitive search for faint gamma-ray pulsations would have taken 24 years to complete on a single computer core. By using the Atlas computer cluster at the AEI Hannover it finished in just 2 days.
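[The core of such a search can be sketched compactly. The toy Python below folds photon arrival times at a trial spin frequency and applies the de Jager H-test for non-uniform phases. It is a generic demonstration, not the AEI pipeline: a real search also fits spin-down, sky position, and the binary orbit, and weights each photon by its probability of coming from the pulsar, which is what makes the parameter space, and the computing cost, explode.]

import numpy as np

# Toy photon-folding pulsation search. Fold each photon arrival time at a
# trial spin frequency; phases that cluster (a pulse) give a large H-test
# value, while uniform phases (no pulse, or a wrong frequency) give a small one.

def fold_phases(times, f, fdot=0.0):
    # Rotational phase in [0, 1) of each arrival time for a trial (f, fdot).
    return (times * f + 0.5 * fdot * times**2) % 1.0

def h_test(phases, m_max=20):
    # de Jager H-test: H = max over m of (Z_m^2 - 4(m - 1)).
    n = len(phases)
    z2, h = 0.0, 0.0
    for k in range(1, m_max + 1):
        c = np.cos(2 * np.pi * k * phases).sum()
        s = np.sin(2 * np.pi * k * phases).sum()
        z2 += (2.0 / n) * (c**2 + s**2)
        h = max(h, z2 - 4.0 * (k - 1))
    return h

rng = np.random.default_rng(0)
f_true = 707.31                            # toy spin frequency, Hz
t = rng.uniform(0.0, 8.5 * 3.156e7, 200)   # 200 photons over ~8.5 years
pulsed = rng.random(200) < 0.3             # 30% of photons belong to the pulse
t[pulsed] -= (fold_phases(t[pulsed], f_true) - 0.1) / f_true  # pile them at phase 0.1

print("H at the true frequency: %.1f" % h_test(fold_phases(t, f_true)))
print("H at a wrong frequency:  %.1f" % h_test(fold_phases(t, f_true + 1e-3)))

Even a frequency offset of a millihertz scrambles the phases completely over years of data, which is why the trial grid, and hence the computing demand, is so fine.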

MPG Institute for Gravitational Physics Atlas Computing Cluster

A strange first detection

“Our search found a signal, but something was wrong! The signal was very faint and not quite where it was supposed to be. The reason: our detection of gamma rays from J0952-0607 had revealed a position error in the initial optical-telescope observations which we used to target our analysis. Our discovery of the gamma-ray pulsations revealed this error,” explains Nieder. “This mistake was corrected in the publication reporting the radio pulsar discovery. A new and extended gamma-ray search made a rather faint – but statistically significant – gamma-ray pulsar discovery at the corrected position.”

Having discovered and confirmed the existence of pulsed gamma radiation from the pulsar, the team went back to the Fermi data and used the full 8.5 years from August 2008 until January 2017 to determine physical parameters of the pulsar and its binary system. Since the gamma radiation from J0952-0607 was so faint, they had to enhance their previously developed analysis method to correctly account for all unknowns.

The pulse profile (distribution of gamma-ray photons during one rotation of the pulsar) of J0952-0607 is shown at the top. Below is the corresponding distribution of the individual photons over the ten years of observations. The greyscale shows the probability (photon weights) for individual photons to originate from the pulsar. From mid 2011 on, the photons line up along tracks corresponding to the pulse profile. This shows the detection of gamma-ray pulsations, which is not possible before mid 2011. L. Nieder/Max Planck Institute for Gravitational Physics.

Another surprise: no gamma-ray pulsations before July 2011

The derived solution contained another surprise: it was impossible to detect gamma-ray pulsations from the pulsar in the data from before July 2011. Why the pulsar only seems to show pulsations after that date is unknown. Variations in the amount of gamma radiation it emitted might be one reason, but the pulsar is so faint that it was not possible to test this hypothesis with sufficient accuracy. Changes in the pulsar orbit seen in similar systems might also offer an explanation, but there was not even a hint in the data that this was happening.

Optical observations raise further questions

The team also used observations with the ESO’s New Technology Telescope at La Silla and the Gran Telescopio Canarias on La Palma to examine the pulsar’s companion star. It is most likely tidally locked to the pulsar, like the Moon to the Earth, so that one side always faces the pulsar and gets heated up by its radiation. As the companion orbits the binary system’s center of mass, its hot “day” side and cooler “night” side are alternately visible from Earth, and the observed brightness and color vary.

These observations create another riddle. While the radio observations point to a distance of roughly 4,400 light-years to the pulsar, the optical observations imply a distance about three times larger. If the system were relatively close to Earth, it would have to contain an extremely compact, high-density companion unlike any seen before, while the larger distance is compatible with the densities of known similar pulsar companions. An explanation for this discrepancy might be the existence of shock waves in the wind of particles from the pulsar, which could lead to a different heating of the companion. More gamma-ray observations with the Fermi LAT should help answer this question.

Searching for continuous gravitational waves

Another group of researchers at the AEI Hannover searched for continuous gravitational wave emission from the pulsar using LIGO data from the first (O1) and second (O2) observation run. Pulsars can emit gravitational waves when they have tiny hills or bumps. The search did not detect any gravitational waves, meaning that the pulsar’s shape must be very close to a perfect sphere with the highest bumps less than a fraction of a millimeter.

Rapidly rotating neutron stars

Understanding rapidly spinning pulsars is important because they are probes of extreme physics. How fast neutron stars can spin before they break apart from centrifugal forces is unknown and depends on unknown nuclear physics. Millisecond pulsars like J0952-0607 are rotating so rapidly because they have been spun up by accreting matter from their companion. This process is thought to bury the pulsar’s magnetic field. With the long-term gamma-ray observations, the research team showed that J0952-0607 has one of the ten lowest magnetic fields ever measured for a pulsar, consistent with expectations from theory.
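[For scale, a standard back-of-envelope estimate, assuming a canonical 1.4-solar-mass, 10-km Newtonian star, gives the mass-shedding limit, i.e., the spin at which material at the equator would be in orbit:]

\[
  f_{\max} \approx \frac{1}{2\pi}\sqrt{\frac{GM}{R^{3}}}
  = \frac{1}{2\pi}\sqrt{\frac{(6.67\times 10^{-11})\,(2.8\times 10^{30}\,\mathrm{kg})}
                             {(10^{4}\,\mathrm{m})^{3}}}
  \approx 2.2\ \mathrm{kHz}.
\]

This crude ceiling ignores deformation and general relativity, both of which lower it, and the true limit depends on the unknown nuclear equation of state, which is exactly why pulsars like J0952-0607 at 707 Hz are such valuable probes.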

Einstein@Home searches for test cases of extreme physics

“We will keep studying this system with gamma-ray, radio, and optical observatories since there are still unanswered questions about it. This discovery also shows once more that extreme pulsar systems are hiding in the Fermi LAT catalogue,” says Prof. Bruce Allen, Nieder’s PhD supervisor and Director at the AEI Hannover. “We are also employing our citizen science distributed computing project Einstein@Home to look for binary gamma-ray pulsar systems in other Fermi LAT sources and are confident that we will make more exciting discoveries in the future.”

Einstein@home, a BOINC project

See the full article here.


Please help promote STEM in your local schools.

Stem Education Coalition

The Max Planck Institute for Gravitational Physics (Albert Einstein Institute) is the largest research institute in the world specializing in general relativity and beyond. The institute is located in Potsdam-Golm and in Hannover where it is closely related to the Leibniz Universität Hannover.

From Texas Advanced Computing Center: “Science enthusiasts, researchers, and students benefit from volunteer computing using BOINC@TACC”


From Texas Advanced Computing Center

June 24, 2019
Faith Singer-Villalobos

You don’t have to be a scientist to contribute to research projects in fields such as biomedicine, physics, astronomy, artificial intelligence, or earth sciences.

Using specialized, open-source software from the Berkeley Open Infrastructure for Network Computing project (BOINC), hundreds of thousands of home and work computers, both consumer devices and organizational resources, are volunteered for scientific computing. Developed over the past 17 years with funding primarily from the National Science Foundation (NSF), BOINC is now used by 38 projects, with more than half a million computers around the world running them.

David Anderson, BOINC’s founder, is a research scientist at the University of California Berkeley Space Sciences Laboratory. His objective in creating BOINC was to build software to handle the details of distributed computing so that scientists wouldn’t have to.

“I wanted to create a new way of doing scientific computing as an alternative to grids, clusters, and clouds,” Anderson said. “As a software system, BOINC has been very successful. It’s evolved without too many growing pains to handle multi-core CPUs, all kinds of GPUs, virtual machines and containers, and Android mobile devices.”

The Texas Advanced Computing Center (TACC) started its own project in 2017 — BOINC@TACC — that supports virtualized, parallel, cloud, and GPU-based applications to allow the public to help solve science problems. BOINC@TACC is the first use of volunteer computing by a major high performance computing (HPC) center.

“BOINC@TACC is an excellent project for making the general public a key contributor in science and technology projects,” said Ritu Arora, a research scientist at TACC and the project lead.

“We love engaging with people in the community who can become science enthusiasts and connect with TACC and generate awareness of science projects,” Arora said. “And, importantly for students and researchers, there is always an unmet demand for computing cycles. If there is a way for us to connect these two communities, we’re fulfilling a major need.”

BOINC volunteer Dick Duggan is a retired IT professional who lives in Massachusetts and has been a volunteer computing enthusiast for more than a decade.

“I’m a physics nerd. Those tend to be my favorite projects,” he said. “I contribute computing cycles to many projects, including the Large Hadron Collider (LHC). LHC is doing state-of-the-art physics — they’re doing physics on the edge of what we know about the universe and are pushing that edge out.”

Duggan uses his laptop, desktop, tablet, and Raspberry Pi to provide computing cycles to BOINC@TACC. “When my phone is plugged in and charged, it runs BOINC@TACC, too.”

Joining BOINC@TACC is simple: Sign up as a volunteer, set up your device, and pick your projects.

Compute cycles on more than 1,300 computing devices have been volunteered for the BOINC@TACC project and more than 300 devices have processed the jobs submitted using the BOINC@TACC infrastructure. The aggregate computer power available through the CPUs on the volunteered devices is about 3.5 teraflops (or 3.5 trillion floating point operations per second).

Why BOINC@TACC?

It’s no secret that computational resources are in great demand, and that researchers with the most demanding computational requirements require supercomputing systems. Access to the most powerful supercomputers in the world, like the resources at TACC, is important for the advancement of science in all disciplines. However, with funding limitations, there is always an unmet need for these resources.

“BOINC@TACC helps fill a gap in what researchers and students need and what the open-science supercomputing centers can currently provide them,” Arora said.

Researchers from UT Austin; any of the 14 UT System institutions; and researchers around the country through XSEDE, the national advanced computing infrastructure in the U.S., are invited to submit science jobs to BOINC@TACC.

To help researchers with this unmet need, TACC started a collaboration with Anderson at UC Berkeley to see how the center could outsource high-throughput computing jobs to BOINC.

When a researcher is ready to submit projects through BOINC@TACC, all they need to do is log in to a TACC system and run a program from their account that will register them for BOINC@TACC, according to Arora. Thereafter, the researcher can continue running programs that will help them (1) decide whether BOINC@TACC is the right infrastructure for running their jobs; and (2) submit the qualified high-throughput computing jobs through the command-line interface. The researchers can also submit jobs through the web interface.

Instead of the job running on Stampede2, for example, it could run on a volunteer’s home or work computer.

“Our software matches the type of resources for a job and what’s available in the community,” Arora said. “The tightly-coupled, compute-intensive, I/O-intensive, and memory-intensive applications are not appropriate for running on the BOINC@TACC infrastructure. Therefore, such jobs are filtered out and submitted for running on Stampede2 or Lonestar5 instead of BOINC@TACC,” she clarified.

A significant number of high-throughput computing jobs are also run on TACC systems in addition to the tightly-coupled MPI jobs. These high-throughput computing jobs consist of large sets of loosely-coupled tasks, each of which can be executed independently and in parallel to other tasks. Some of these high-throughput computing jobs have modest memory and input/output needs, and do not have an expectation of a fixed turnaround time. Such jobs qualify to run on the BOINC@TACC infrastructure.
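[A toy Python sketch of that routing decision follows. The field names and thresholds are invented for illustration, since the actual BOINC@TACC filter is not spelled out here; only the qualitative rule, loosely coupled tasks with modest memory and I/O needs and no fixed turnaround, comes from the article.]

from dataclasses import dataclass

# Hypothetical sketch of the job-routing decision described above -- not the
# real BOINC@TACC filter. Tightly coupled or deadline-bound jobs stay on the
# HPC systems; modest, loosely coupled tasks can go to volunteer devices.

@dataclass
class Job:
    tightly_coupled: bool   # MPI ranks that must run in lock-step
    memory_gb: float        # peak memory per task
    io_gb: float            # input + output data volume per task
    fixed_deadline: bool    # does the user expect a fixed turnaround?

def qualifies_for_boinc(job: Job) -> bool:
    # Invented cutoffs: 4 GB memory and 1 GB of I/O per task.
    if job.tightly_coupled or job.fixed_deadline:
        return False
    return job.memory_gb <= 4 and job.io_gb <= 1

jobs = [Job(False, 2.0, 0.2, False),    # HTC-style task: qualifies
        Job(True, 32.0, 10.0, False)]   # stays on Stampede2/Lonestar5

for j in jobs:
    print(j, "->", "BOINC@TACC" if qualifies_for_boinc(j) else "Stampede2/Lonestar5")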

“Volunteer computing is well-suited to this kind of workload,” Anderson said. “The idea of BOINC@TACC is to offload these jobs to a BOINC server, freeing up the supercomputers for the tightly-coupled parallel jobs that need them.”

To start, the TACC team deployed an instance of the BOINC server on a cloud-computing platform. Next, the team developed the software for integrating BOINC with supercomputing and cloud computing platforms. During the process, the project team developed and released innovative software components that can be used by the community to support projects from a variety of domains. For example, a cloud-based shared filesystem and a framework for creating Docker images that was developed in this project can be useful for a variety of science gateway projects.

As soon as the project became operational, volunteers enthusiastically started signing up. The number of researchers using BOINC@TACC is gradually increasing.

Carlos Redondo, a senior in Aerospace Engineering at UT Austin, is both a developer on the BOINC@TACC project and a researcher who uses the infrastructure.

“The incentive for researchers to use volunteer computing is that they save on their project allocation,” Redondo said. “But researchers need to be mindful that the number of cores on volunteer systems is going to be small, and they don’t have the special optimization that servers at TACC have.”

As a student researcher, Redondo has submitted multiple computational fluid dynamics jobs through BOINC@TACC. In this field, computers are used to simulate the flow of fluids and the interaction of the fluid (liquids and gases) with surfaces. Supercomputers can achieve better solutions and are often required to solve the largest and most complex problems.

“The results in terms of the numbers produced from the volunteer devices were exactly those expected, and also identical to those running on Stampede2,” he said.

Since jobs run whenever volunteers’ computers are available, researchers’ turnaround time is longer than that of Stampede2, according to Redondo. “Importantly, if a volunteer decides to stop a job, BOINC@TACC will automatically safeguard the progress, protect the data, and save the results.”

TACC’s Technical Contribution to BOINC

BOINC software works out of the box. What it doesn’t include is support for directly accepting jobs from supercomputers.

“We’re integrating BOINC software with the software that is running on supercomputing devices so these two pieces can talk to each other when we have to route qualified high-throughput computing jobs from supercomputers to volunteer devices. The other piece TACC has contributed is extending BOINC to the cloud computing platforms,” Arora said.

Unlike other BOINC projects, the BOINC@TACC infrastructure can execute jobs on Virtual Machines (VMs) running on cloud computing systems. These systems are especially useful for GPU jobs and for assuring a certain quality of service to the researchers. “If the pool of the volunteered resources goes down, we’re able to route the jobs to the cloud computing systems and meet the expectations of the researchers. This is another unique contribution of the project,” Arora said.

BOINC@TACC is also pioneering the use of Docker to package custom-written science applications so that they can run on volunteered resources.

Furthermore, the project team is planning to collaborate with companies that may have corporate social responsibility programs for soliciting compute-cycles on their office computers or cloud computing systems.

“We have the capability to harness office desktops and laptops, and also the VMs in the cloud. We’ve demonstrated that we’re capable of routing jobs from Stampede2 to TACC’s cloud computing systems, Chameleon and Jetstream, through the BOINC@TACC infrastructure,” Arora said.

Anderson concluded, “We hope that BOINC@TACC will provide a success story that motivates other large scientific computing centers to use the same approach. This will benefit thousands of computational scientists and, we hope, will greatly increase the volunteer population.”

Dick Duggan expressed a common sentiment of BOINC volunteers that people want to do it for the love of science. “This is the least I can do. I may not be a scientist but I’m accomplishing something…and it’s fun to do,” Duggan said.

Learn More: The software infrastructure that TACC developed for routing jobs from TACC systems to the volunteer devices and the cloud computing systems is described in this paper.

BOINC@TACC is funded through NSF award #1664022. The project collaborators are grateful to TACC, XSEDE, and the Science Gateway Community Institute (SGCI) for providing the resources required for implementing this project.

Computing power
24-hour average: 17.707 PetaFLOPS.
Active: 139,613 volunteers, 590,666 computers.
Not considered for the TOP500 because it is distributed, BOINC is right now more powerful than No. 9 Titan (17.590 PetaFLOPS) and No. 10 Sequoia (17.173 PetaFLOPS).

My BOINC

See the full article here.

Please help promote STEM in your local schools.


Stem Education Coalition

The Texas Advanced Computing Center (TACC) designs and operates some of the world’s most powerful computing resources. The center’s mission is to enable discoveries that advance science and society through the application of advanced computing technologies.

TACC Maverick HP NVIDIA supercomputer
TACC Lonestar Cray XC40 supercomputer
Dell PowerEdge U Texas Austin Stampede supercomputer, Texas Advanced Computing Center, 9.6 PF
TACC HPE Apollo 8000 Hikari supercomputer
TACC DELL EMC Stampede2 supercomputer

From SETI@home via The Ringer: “E.T.’s Home Phone”

SETI@home
From SETI@home

via

The Ringer

May 24, 2019
Ben Lindbergh

UC Berkeley’s SETI@home, one of the most significant citizen-science projects of the late 20th century, brought the search for intelligent life to PCs. It hasn’t yet found what it set out to, but there’s still hope.

Getty Images/Ringer illustration

Around the time the movie Contact came out in 1997, Kevin D., a governmental IT support and procurement employee in Toronto, saw a notice on a technical news site about a piece of software that was being developed by researchers at the University of California, Berkeley. The scientists were interested in SETI, the Search for Extraterrestrial Intelligence, and courtesy of Contact, so was Kevin D. The moment he heard about the program that would eventually come to be called SETI@home wasn’t as dramatic as Jodie Foster’s portrayal of Dr. Eleanor Arroway discovering a message sent across the universe, but it would make a major impact on the next two decades of his life. It also signaled the advent of a productive and unprecedented citizen-science project that continues today, 20 years after it launched in May 1999.

Kevin D. aspired to be a scientist as soon as he could read, but financial difficulties forced him to drop out of university, which put an end to that dream. “I could have probably gone with student loans and a few years of eating ramen, but I wasn’t in the right frame of mind anymore,” he says. “SETI@home and other distributed-computing projects have filled that need nicely, allowing me to contribute to science on a scale that would have been unimaginable just a few decades ago.”

SETI@home was the brainchild of a UC Berkeley grad student named David Gedye, who came up with the concept of using personal computers for scientific purposes in 1995. “That was the point where a lot of home computers were on the internet,” says Berkeley research scientist David Anderson, Gedye’s grad advisor and the cofounder of SETI@home. “Also the point where personal computers were becoming fast enough that they could potentially do number-crunching for scientific purposes.”

Gedye thought using computers to comb through data recorded by radio telescopes in search of signals sent by intelligent extraterrestrial life would both appeal to the public and demonstrate the potential for public participation to boost the scientific community’s processing power. He and Anderson joined forces with multiple partners in the astronomy and SETI fields, including Eric Korpela, the current director of SETI@home, and Dan Werthimer, the Berkeley SETI Research Center’s chief scientist. Werthimer was a SETI veteran who had been hunting for alien life since the 1970s and oversaw the SERENDIP program, which piggybacks on observations that radio astronomers are already conducting and scours the results for evidence that E.T. is phoning our home. SERENDIP supplied the incipient SETI@home with data from the venerable Arecibo Observatory in Puerto Rico, which until 2016 featured the world’s largest single-aperture radio telescope.

NAIC Arecibo Observatory operated by University of Central Florida, Yang Enterprises and UMET, Altitude 497 m (1,631 ft).

Fueled by $50,000 from the Planetary Society and $10,000 from a company backed by Microsoft cofounder and SETI enthusiast Paul Allen, Korpela and Anderson started designing software that would split that data into chunks that could be distributed to personal computers, processed, and sent back to Berkeley for further analysis. By the spring of ’99, SETI@home was ready to launch, despite the difficulty of making it compatible with all kinds of computers and dealing with pre-broadband internet. But its creators weren’t prepared for the outpouring of public interest that propagated through word of mouth and posts on forums and sites such as Slashdot.

“The biggest issue was not the people on dial-up connections,” Korpela recalls. “It was just the sheer number of people that were interested in SETI@home. When we started SETI@home, we planned or thought that maybe we could get 10,000 people to be interested in doing this. The day we turned it on, we had close to half a million people show up.”

In 1999, the public portion of the internet was new enough that going viral was a nearly unknown phenomenon. But Korpela says that within a month or two, SETI@home had attracted a couple million active users, which overwhelmed the modest equipment underpinning the project, causing frequent crashes. “We were planning on running our servers from a small desktop machine,” Korpela says. “That didn’t really work.” Sun Microsystems stepped in to donate more powerful hardware, and SETI@home users helped the perpetually underfunded program defray the cost of bandwidth, which was expensive at the time. In 1999, Korpela says, Berkeley was paying $600 a month for each megabit per second, and SETI@home was guzzling about 25, roughly $15,000 a month.

On the plus side, the uptick in processing power was immediately apparent. “The main benefit of the SETI@home–type processing is that it gives us about a factor-of-10 increase in sensitivity,” Korpela says. “So we can detect a signal that’s 10 times smaller than we could just using the instrumentation that’s available at the radio telescope.”

As SETI@home spread, a few of its more zealous acolytes ran afoul of the workplaces where they installed it, which the program’s creators advised users not to do without permission. In 2001, 16 employees of the Tennessee Valley Authority were reprimanded for installing the software on their office computers. (I know the feeling; my mom wasn’t pleased about the electricity costs she claimed I was incurring when she spotted the screensaver on my own early-2000s PC.) In 2002, computer technician David McOwen faced criminal charges and was ultimately put on probation when he installed SETI@home at DeKalb Technical College in Atlanta. And in 2009, network systems administrator Brad Niesluchowski lost his job after installing SETI@home on thousands of computers across an Arizona school district. (Niesluchowski, or “NEZ,” still ranks 17th on the all-time SETI@home leaderboard for data processed.) Korpela has made several SETI@home sightings in the wild, including on point-of-sale cash registers and, once, on a public computer at an Air Force base (which wasn’t Area 51).

Over the decades, SETI@home’s user base has dwindled to between 100,000 and 150,000 people, operating an average of two computers and six to eight CPUs per person. But the remaining participants’ computers are hundreds or thousands of times more powerful than they were in 1999. “When we started, we designed our work units—our data chunks going out to people—to be something that a typical PC would be able to finish computing in about a week, and a current GPU will do those in a couple of minutes,” Korpela says. SETI@home is now available via an Android app that’s used by about 12,000 participants, and even smartphones smoke turn-of-the-century desktop computers in processing speed.

The SETI@home software has evolved along with the hardware that hosts it. In the early years, the program ran as a screensaver, which served multiple purposes. First, screensavers were popular, so the software filled a need. Second, the graphical representations of the program’s activities fed users’ scientific curiosity and reassured them that the program was working as intended. And third, it functioned as eye candy that entertained users and caught the attention of anyone within visual range. Now that screensavers have fallen out of favor and more people prefer to turn off their monitors or computers when they’re not in use to save power, Anderson says, “We’ve kind of moved away from the screensaver model to the model of just running invisibly in the background while you’re at your computer.”

A shortcoming of the original SETI@home software led to a much more significant change—and, indirectly, the greatest legacy of SETI@home, at least so far. In the program’s initial form, the signal-processing logic and the code that handled displaying the screensaver and receiving and transmitting data were a package deal. “Each time we wanted to change the algorithms, to change the scientific part, we had to have all of our users download and install a new program,” Anderson says. “And then we would lose some fraction of our users each time we did that.”

The solution was separating the science part from the distributed-computing part by building a platform that could update the algorithm without requiring a reinstall. Better yet, that platform could act as a conduit for any number of alternative distributed-computing efforts. In 2002, Anderson built and released that system, which he called Berkeley Open Infrastructure for Network Computing, or BOINC.
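[The design insight is worth a sketch. Conceptually, and much simplified relative to BOINC’s real protocol, the volunteer installs a generic client once, and the science application arrives as a versioned component the client can swap without a reinstall. Everything below, the server stand-in, the version field, the toy work unit, is invented to illustrate that separation.]

# Conceptual sketch of decoupling the science app from the client -- not
# BOINC's actual protocol. The client code never changes; only the science
# application is versioned and swapped when the project publishes a new one.

SERVER = {"app_version": 3, "app": lambda chunk: sum(chunk)}  # stand-in server

class VolunteerClient:
    def __init__(self):
        self.app_version = 0
        self.app = None

    def sync_app(self):
        # Fetch the science application only when a newer version exists.
        if SERVER["app_version"] > self.app_version:
            self.app_version = SERVER["app_version"]
            self.app = SERVER["app"]

    def crunch(self, work_unit):
        self.sync_app()              # client infrastructure stays installed
        return self.app(work_unit)   # science code may have been swapped

client = VolunteerClient()
print(client.crunch([1, 2, 3]))      # runs whatever app version is current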

SETI@home, which migrated to BOINC in 2005, has thus far failed in its primary purpose: to detect intelligent alien life. But it’s succeeded in its secondary goal of demonstrating the viability of distributed computing. Other researchers have emulated that model, and BOINC, which is funded primarily by the National Science Foundation, is now home to 38 active projects that are doing useful science, including investigating diseases and identifying drugs that could combat cancer, modeling climate change, and searching for phenomena such as pulsars and gravitational waves. Research conducted by BOINC-based projects has generated 150 scientific papers (and counting), and the network’s collective computing power—about 27 petaflops—makes it more powerful than all but four of the world’s individual supercomputers. Anderson, who believes volunteer computing is still underutilized by the scientific community, says it’s especially “well suited to the general area of physical simulations where you have programs that simulate physical reality, which scales anywhere from the atomic level up to the entire universe.”

See the full article here.


Please help promote STEM in your local schools.

Stem Education Coalition


From World Community Grid (WCG): “A Graduation, a Paper, and a Continuing Search for the ‘Help Stop TB’ Researchers”


From World Community Grid (WCG)

By: Dr. Anna Croft
University of Nottingham, UK
28 Sep 2018

Summary
In this update, principal investigator Dr. Anna Croft shares two recent milestones for the Help Stop TB research team, and discusses their continuing search for additional researchers.

The Help Stop TB (HSTB) project uses the massive computing power of World Community Grid to examine part of the coating of Mycobacterium tuberculosis, the bacterium that causes tuberculosis. We hope that by learning more about the mycolic acids that are part of this coating, we can contribute to the search for better treatments for tuberculosis, which is one of the world’s deadliest diseases.

Graduation Ceremony for Dr. Athina Meletiou

In recent news for the HSTB project, Dr. Athina Meletiou has now officially graduated. It was a lovely day, finished off with some Pimms and Lemonade in the British tradition.

Athina (center) with supervisors Christof (left) and Anna (right)

Athina and her scientific “body-guard,” Christof

Search for New Team Members Continues

We are still looking for suitably qualified chemists, biochemists, mathematicians, engineers and computer scientists to join our team, especially to develop new analytical approaches (including machine-learning approaches) for understanding the substantial data generated by World Community Grid volunteers.

We will be talking to students from our BBSRC-funded doctoral training scheme in the next few days and encouraging them to join the project. Click here for more details.

Paper Published

Dr. Wilma Groenwald, one of the founding researchers for the HSTB project, recently published a paper describing some of the precursor work for the project. The paper, which discusses the folding behavior of mycolic acids, is now freely available on ChemRxiv: “Revealing Solvent-Dependent Folding Behavior of Mycolic Acids from Mycobacterium Tuberculosis by Advanced Simulation Analysis.”

We hope to have Athina’s first papers with World Community Grid data available later in the year, and will keep you updated.

Thank you to all volunteers for your support.

See the full article here.



Please help promote STEM in your local schools.

Stem Education Coalition

Ways to access the blog:
https://sciencesprings.wordpress.com
http://facebook.com/sciencesprings
“World Community Grid (WCG) brings people together from across the globe to create the largest non-profit computing grid benefiting humanity. It does this by pooling surplus computer processing power. We believe that innovation combined with visionary scientific research and large-scale volunteerism can help make the planet smarter. Our success depends on like-minded individuals – like you.”
WCG projects run on BOINC software from UC Berkeley.

BOINC is a leader in the fields of Distributed Computing, Grid Computing and Citizen Cyberscience. BOINC is more properly the Berkeley Open Infrastructure for Network Computing.


CAN ONE PERSON MAKE A DIFFERENCE? YOU BET!!

My BOINC
“Download and install secure, free software that captures your computer’s spare power when it is on, but idle. You will then be a World Community Grid volunteer. It’s that simple!” You can download the software at either WCG or BOINC.

Please visit the project pages-

Microbiome Immunity Project

FightAIDS@home Phase II

OpenZika

Help Stop TB

Outsmart Ebola Together

Mapping Cancer Markers

Uncovering Genome Mysteries

Say No to Schistosoma

GO Fight Against Malaria

Drug Search for Leishmaniasis

Computing for Clean Water

The Clean Energy Project

Discovering Dengue Drugs – Together

Help Cure Muscular Dystrophy

Help Fight Childhood Cancer

Help Conquer Cancer

Human Proteome Folding

FightAIDS@Home

World Community Grid is a social initiative of IBM Corporation
IBM – Smarter Planet