Keep your eyes peeled for cosmic debris: Andrew Westphal about Stardust@home

Sunday, May 28, 2006

Stardust is a NASA space capsule that collected samples from comet 81P/Wild (also known as “Wild 2”) in deep space and landed back on Earth on January 15, 2006. It was decided that a collaborative online review process would be used to “discover” the microscopically small samples the capsule collected. The project is called Stardust@home. Unlike distributed computing projects like SETI@home, Stardust@home relies entirely on human intelligence.

Andrew Westphal is the director of Stardust@home. Wikinews interviewed him for May’s Interview of the Month (IOTM) on May 18, 2006. As always, the interview was conducted on IRC, with multiple people asking questions.

Some may not know exactly what Stardust or Stardust@home is. Can you explain more about it for us?

Stardust is a NASA Discovery mission that was launched in 1999. It is really two missions in one. The primary science goal of the mission was to collect a sample from a known primitive solar-system body, a comet called Wild 2 (pronounced “Vilt-two” — the discoverer was German, I believe). This is the first [US] “sample return” mission since Apollo, and the first ever from beyond the Moon. This gives a little context. By “sample return” of course I mean a mission that brings back extraterrestrial material. I should have said above that this is the first “solid” sample return mission — Genesis brought back a sample from the Sun almost two years ago, but Stardust is also bringing back the first solid samples from the local interstellar medium — basically this is a sample of the Galaxy. This is absolutely unprecedented, and we’re obviously incredibly excited. I should mention parenthetically that there is a fantastic launch video — taken from the POV of the rocket on the JPL Stardust website — highly recommended — best I’ve ever seen — all the way from the launch pad, too. Basically interplanetary trajectory. Absolutely great.

Is the video available to the public?

Yes [see below]. OK, I digress. The first challenge that we have before we can do any kind of analysis of these interstellar dust particles is simply to find them. This is a big challenge because they are very small (of order a micron in size) and are somewhere (we don’t know where) on a HUGE collector — at least on the scale of the particle size — about a tenth of a square meter. So

We’re right now using an automated microscope that we developed several years ago for nuclear astrophysics work to scan the collector in the Cosmic Dust Lab in Building 31 at Johnson Space Center. This is the ARES group that handles returned samples (Moon Rocks, Genesis chips, Meteorites, and Interplanetary Dust Particles collected by U2 in the stratosphere). The microscope collects stacks of digital images of the aerogel collectors in the array. These images are sent to us — we compress them and convert them into a format appropriate for Stardust@home.

Stardust@home is a highly distributed project using a “Virtual Microscope” that is written in HTML and JavaScript and runs on most browsers — no downloads are required. Using the Virtual Microscope, volunteers can search over the collector for the tracks of the interstellar dust particles.
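The interview does not include any of the Virtual Microscope’s code, but the basic interaction — stepping through a stack of images of the same field taken at different focus depths — can be sketched roughly as below. The element ID, image paths, and wheel-to-focus mapping are illustrative assumptions, not the actual Stardust@home implementation.

```typescript
// Minimal sketch of a focus-stack viewer: the mouse wheel steps up and down
// through pre-rendered images of one field of view at different focus depths.
// The element ID and image URLs are made up for illustration.
const FOCUS_LAYERS: string[] = [
  "layers/field042_z00.jpg",
  "layers/field042_z01.jpg",
  "layers/field042_z02.jpg",
];

let currentLayer = 0;
const viewer = document.getElementById("virtual-microscope") as HTMLImageElement;

function showLayer(index: number): void {
  // Clamp to the available focus depths and swap the displayed image.
  currentLayer = Math.max(0, Math.min(FOCUS_LAYERS.length - 1, index));
  viewer.src = FOCUS_LAYERS[currentLayer];
}

// Scrolling "focuses" through the aerogel, the way a real microscope knob would.
viewer.addEventListener("wheel", (event: WheelEvent) => {
  event.preventDefault();
  showLayer(currentLayer + (event.deltaY > 0 ? 1 : -1));
});

showLayer(0);
```

A real viewer would also need panning and image preloading, but the core of the interaction is simply swapping pre-rendered depth slices in the browser.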

How many samples do you anticipate being found during the course of the project?

Great question. The short answer is that we don’t know. The long answer is a bit more complicated. Here’s what we know. The Galileo and Ulysses spacecraft carried dust detectors onboard that Eberhard Gruen and his colleagues used to first detect and then measure the flux of interstellar dust particles streaming into the solar system. (This is a kind of “wind” of interstellar dust, caused by the fact that our solar system is moving with respect to the local interstellar medium.) Markus Landgraf has estimated the number of interstellar dust particles that should have been captured by Stardust during two periods of the “cruise” phase of the interplanetary orbit in which the spacecraft was moving with this wind. He estimated that there should be around 45 particles, but this number is very uncertain — I wouldn’t be surprised if it is quite different from that. That was the long answer! One thing that I should say…is that like all research, the outcome of what we are doing is highly uncertain. There is a wonderful quote attributed to Einstein — “If we knew what we were doing, it wouldn’t be called ‘research’, would it?”

How big would the samples be?

We expect that the particles will be of order a micron in size. (A millionth of a meter.) When people are searching using the virtual microscope, they will be looking not for the particles, but for the tracks that the particles make, which are much larger — several microns in diameter. Just yesterday we switched over to a new site which has a demo of the VM (virtual microscope). I invite you to check it out. The tracks in the demo are from submicron carbonyl iron particles that were shot into aerogel using a particle accelerator modified to accelerate dust particles to very high speeds, to simulate the interstellar dust impacts that we’re looking for.

And that’s on the main Stardust@home website [see below]?

Yes.

How long will the project take to complete?

Partly the answer depends on what you mean by “the project”. The search will take several months. The bottleneck, we expect (but don’t really know yet), is in the scanning — we can only scan about one tile per day and there are 130 tiles in the collector…. These particles will be quite diverse, so we’re hoping that we’ll continue to have lots of volunteers collaborating with us on this after the initial discoveries. It may be that the 50th particle that we find will be the real Rosetta stone that turns out to be critical to our understanding of interstellar dust. So we really want to find them all! Enlarging the idea of the project a little beyond the search, though, the goal is to actually analyze these particles. That’s the whole point, obviously!

And this is the huge advantage with this kind of a mission — a “sample return” mission.

Most missions do things quite differently… you have to build an instrument to make a measurement, and that instrument design gets locked in several years before launch, practically guaranteeing that it will be obsolete by the time you launch. Here exactly the opposite is true. Several of the instruments that are now being used to analyze the cometary dust did not exist when the mission was launched. Further, some instruments (e.g., synchrotrons) are the size of shopping malls — you don’t have a hope of flying these in space. So we can and will study these samples for many years. AND we have to preserve some of these dust particles for our grandchildren to analyze with their hyper-quark-gluon plasma microscopes (or whatever)!

When do you anticipate the project will start?

We’re really frustrated with the delays that we’ve been having. Some of it has to do with learning how to deal with the aerogel collectors, which are rougher and more fractured than we expected. The good news is that they are pretty clean — there is very little of the dust that you see on our training images — these were deliberately left out in the lab to collect dust so that we could give people experience with the worst case we could think of. In learning how to do the scanning of the actual flight aerogel, we uncovered a couple of bugs in our scanning software — which forced us to go back and rescan. Part of the other reason for the delay was that we had to learn how to handle the collector — it would cost $200M to replace it if something happened to it, so we had to develop procedures to deal with it, and add several new safety features to the Cosmic Dust Lab. This all took time. Finally, we’re distracted because we also have many responsibilities for the cometary analysis, which has a deadline of August 15 for finishing analysis. The IS project has no such deadline, so at times we had to delay the IS (interstellar, sorry) in order to focus on the cometary work. We are very grateful to everyone for their patience on this — I mean that very sincerely.

And rest assured that we’re just as frustrated!

I know there will be a “test” that participants will have to take before they can examine the “real thing”. What will that test consist of?

The test will look very similar to the training images that you can look at now. But… there will of course be no annotation to tell you where the tracks are!

Why did NASA decide to take the route of distributed computing? Will they do this again?

I wouldn’t say that NASA decided to do this — the idea for Stardust@home originated here at U. C. Berkeley. Part of the idea of course came…

If I understand correctly it isn’t distributed computing, but distributed eyeballing?

…from the SETI@home people who are just down the hall from us. But as Brian just pointed out, this is not really distributed computing like SETI@home: the computers are just platforms for the VM, and it is human eyes and brains that are doing the real work, which makes it fun (IMHO).

That said… There have been quite a few people who have expressed interest in developing automated algorithms for searching. Just because WE don’t know how to write such an algorithm doesn’t mean nobody does. We’re delighted at this and are happy to help make it happen.

Isn’t there a catch-22, in that the data you’re going to collect would be a prerequisite to automating the process?

That was the conclusion that we came to early on — that we would need some sort of training set to be able to train an algorithm. Of course you have to train people too, but we’re hoping (we’ll see!) that people are more flexible in recognizing things that they’ve never seen before and pointing them out. Our experience is that people who have never seen a track in aerogel can learn to recognize them very quickly, even against a big background of cracks, dust and other sources of confusion… Coming back to the original question — although NASA didn’t originate the idea, they are very generously supporting this project. It wouldn’t have happened without NASA’s financial support (and of course access to the Stardust collector). Did that answer the question?

Will a project like this be done again?

I don’t know… There are only a few projects for which this approach makes sense… In fact, I frankly haven’t run across another at least in Space Science. But I am totally open to the idea of it. I am not in favor of just doing it as “make-work” — that is just artificially taking this approach when another approach would make more sense.

How did the idea come up to do this kind of project?

Really desperation. When we first thought about this we assumed that we would use some sort of automated image recognition technique. We asked some experts around here in CS and the conclusion was that the problem was somewhere between trivial and impossible, and we wouldn’t know until we had some real examples to work with. So we talked with Dan Wertheimer and Dave Anderson (literally down the hall from us) about the idea of a distributed project, and they were quite encouraging. Dave proposed the VM machinery, and Josh Von Korff, a physics grad student, implemented it. (Beautifully, I think. I take no credit!)

I got to meet one of the Stardust directors in March during the Texas Aerospace Scholars program at JSC. She talked about searching for meteorites in Antarctica, ones that were unblemished by Earth conditions. Is that our best chance of finding new information on comets and asteroids? Or will more Stardust programs be our best solution?

That’s a really good question. Much will depend on what we learn during this official “Preliminary Examination” period for the cometary analysis. Aerogel capture is pretty darn good, but it’s not perfect and things are altered during capture in ways that we’re still understanding. I think that much also depends on what question you’re asking. For example, some of the most important science is done by measuring the relative abundances of isotopes in samples, and these are not affected (at least not much) by capture into aerogel.

Also, she talked about how some of the agencies that they gave samples to had lost or destroyed 2-3 samples while trying to analyze them. One, in fact, had become statically charged and stuck to the side of the microscope lens, and they spent over an hour looking for it. Is that really our biggest danger? Giving out samples as a show of good faith, and not letting NASA examine all samples collected?

These will be the first measurements, probably, that we’ll make on the interstellar dust. There is always a risk of loss. Fortunately for the cometary samples there is quite a lot there, so it’s not a disaster. NASA has some analytical capabilities, particularly at JSC, but the vast majority of the analytical capability in the community is not at NASA but is at universities, government labs and other institutions all over the world. I should also point out that practically every analytical technique is destructive at some level. (There are a few exceptions, but not many.) The problem with meteorites is that except in a very few cases, we don’t know where they specifically came from. So having a sample that we know for sure is from the comet is golden!

I am currently working on my Bachelor’s in computer science, with a minor in astronomy. Do you see the successes of programs like Stardust opening up more private space exploration positions for people such as myself, even though I’m not in the typical “space” fields of education?

Can you elaborate on your question a little — I’m not sure that I understand…

Well, while at JSC I learned that they mostly want engineers, and a few science grads, and I worry that my computer science degree will not be very valuable, as the NASA rep told me only 1% of the applicants for their work-study program are CS majors. I’m just curious as to your thoughts on whether CS majors will be more in demand now that projects like Stardust and the Mars missions have been great successes. Have you seen a trend towards more private businesses moving in that direction, especially with President Bush’s statement about a man on the Moon in 2015?

That’s a good question. I am personally not very optimistic about the direction that NASA is going. Despite recent successes, including but not limited to Stardust, science at NASA is being decimated.

I made a joke with some people at the TAS event that one day SpaceShipOne will be sent up to save stranded ISS astronauts. It makes me wonder what kind of private redundancy the US government is taking for future missions.

I guess one thing to be a little cautious about is that despite SpaceShipOne’s success, we haven’t had an orbital project that has been successful in that style of private enterprise. It would be nice to see that happen. I know that there’s a lot of interest…!

Now I know the answer to this question… but a lot do not… When samples are found, how will they be analyzed? Who gets the credit for finding the samples?

The first person who identifies an interstellar dust particle will be acknowledged on the website (and probably will be much in demand for interviews from the media!), will have the privilege of naming the particle, and will be a co-author on any papers that WE (at UCB) publish on the analysis of the particle. Also, although we are precluded from paying for travel expenses, we will invite those who discover particles AND the top performers to our lab for a hands-on tour.

We have some fun things, including micromachines.

How many people/participants do you expect to have?

About 113,000 have preregistered on our website. Frankly, I don’t have a clue how many will actually volunteer and do a substantial amount of searching. We’ve never done this before, after all!

One last thing I want to say … well, two. First, we are making a special effort not to do any searching ourselves before we go “live”. It would not be fair to all the volunteers for us to get a jump-start on the search. All we are doing is looking at a few random views to make sure that the focus and illumination are good. (And we haven’t seen anything — no surprise at all!) Also, the attitude for this should be “Have Fun”. If you’re not having fun doing it, stop and do something else! A good maxim for life in general!

Retrieved from “https://en.wikinews.org/w/index.php?title=Keep_your_eyes_peeled_for_cosmic_debris:_Andrew_Westphal_about_Stardust@home&oldid=4608360”

Study says to clean your sponge, microwave it

Thursday, January 25, 2007


Research on germs and bacteria by scientists at the University of Florida shows that a dirty kitchen sponge can be cleaned and “sterilized” by microwaving it for two minutes, but the researchers warn to wet the sponge first.

“People often put their sponges and scrubbers in the dishwasher, but if they really want to decontaminate them and not just clean them, they should use the microwave,” said the professor who was in charge of the study that discovered the results, Gabriel Bitton.

“Basically what we find is that we could knock out most bacteria in two minutes. The microwave is a very powerful and an inexpensive tool for sterilization,” added Bitton.

The sponges that researchers studied were placed in “raw wastewater” and then put into a microwave to be “zapped,” according to Bitton. The wastewater was a “witch’s brew of fecal bacteria, viruses, protozoan parasites and bacterial spores, including Bacillus cereus spores,” said Bitton.

Researchers say that at least 99% of the bacteria, viruses, spores and parasites in kitchen sponges can be destroyed or “inactivated” by simply microwaving the wet sponge, on the highest power, for two minutes.

Retrieved from “https://en.wikinews.org/w/index.php?title=Study_says_to_clean_your_sponge,_microwave_it&oldid=440468”

Study says dogs can smell lung and breast cancer

Monday, August 7, 2006

Dogs can be trained to detect early and late stages of lung and breast cancer accurately according to a study published by California scientists in the little-known scientific journal Integrative Cancer Therapies.

The study took place over the last five years at the Pine Street Foundation, a non-profit organization which conducts evidence-based research on integrative medicine (combining complementary and alternative medicine and mainstream medicine). Michael McCulloch and colleagues used three Labrador Retrievers and two Portuguese Water Dogs, both common pet breeds, that had received basic behavioral dog training. The researchers trained the dogs to lie down next to a sample from a cancer patient and to ignore other samples.

The samples used were breath samples from 55 patients with lung cancer and 31 with breast cancer — the two types of cancer with the highest mortality rates in the United States.

After the training phase, the dogs’ diagnostic accuracy was tested in a double-blind experiment. Among lung cancer patients, both sensitivity and specificity were 99%; for breast cancer, sensitivity was 88% and specificity 98%. Because these figures seem almost too good to be true, cancer experts are at the same time baffled and skeptical. The authors of the study themselves also say replication of the study is needed.
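For readers unfamiliar with the terms, sensitivity is the fraction of true cancer samples the dogs flagged, and specificity is the fraction of healthy control samples they correctly left alone. A small sketch with made-up counts (not the study’s raw data, which the article does not give) shows how the two figures are computed:

```typescript
// Sensitivity and specificity from a 2x2 confusion matrix.
// The counts below are illustrative only, not taken from the study.
interface Counts {
  truePositives: number;  // cancer samples the dogs flagged
  falseNegatives: number; // cancer samples the dogs missed
  trueNegatives: number;  // healthy samples the dogs ignored
  falsePositives: number; // healthy samples the dogs flagged
}

function sensitivity(c: Counts): number {
  return c.truePositives / (c.truePositives + c.falseNegatives);
}

function specificity(c: Counts): number {
  return c.trueNegatives / (c.trueNegatives + c.falsePositives);
}

// Example: 88 of 100 cancer breath samples flagged and 98 of 100 controls
// ignored would give sensitivity 0.88 and specificity 0.98, matching the
// figures reported for breast cancer.
const example: Counts = {
  truePositives: 88,
  falseNegatives: 12,
  trueNegatives: 98,
  falsePositives: 2,
};
console.log(sensitivity(example), specificity(example)); // 0.88 0.98
```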

Importantly, this was independent of the cancer stage, meaning the dogs were able to pick up the scent of cancer in its early stages. This is important because in many cases, the success of any treatment depends on early diagnosis. However, the researchers don’t believe this will lead to the use of dogs in the clinic soon; rather, they want to find out which chemicals are actually sensed by the canines, because those could be used in laboratory assays. “It’s not like someone would start chemotherapy based on a dog test,” Dr. Gansler of the American Cancer Society said. “They’d still get a biopsy.”

The researchers were inspired by anecdotal reports about dogs detecting cancer. In 1989, a British woman consulted her family physician because her Dalmatian kept licking a mole on her leg. A biopsy showed it to be a malignant melanoma. When diagnosed too late, this form of cancer has a poor survival rate, but in this case early surgery was made possible, and the woman survived. Prior studies showed that breath samples from patients with lung cancer or breast cancer contain distinct biochemical markers. This provides a basis for the hypothesis that some cancer types produce volatile chemicals that dogs could smell. A study published in the British Medical Journal already showed that dogs could use their exquisite sense of smell to detect bladder cancer in urine samples, but they were only correct in 41% of cases, and another study provided preliminary evidence that dogs could detect melanomas.

This doesn’t mean you can show your breasts to your dog and it will tell you if you have cancer, other physicians caution, and scientists do not advise people to train their dogs to sniff for cancer. Unresolved issues from the study include the fact that subjects were required to breathe deeper than normal, so it is not certain whether dogs can smell cancer in normal breath. Also, whether this is a permanent skill that dogs would retain was not tested.

Finally, there are concerns that could arise over liability issues: who would be responsible when the dog makes a mistake?

Current detection methods for both lung and breast cancer are not flawless. For lung cancer, chest X-rays and sputum cytology (detecting cancer cells in coughed-up fluid) fail to detect many early cases, and CT scans produce many false-positive results unless combined with expensive PET scans. Although it might be comparing apples and oranges, a $2.5 million CT scanner has an accuracy of 85 to 90%. Mammography also produces false-positive results, and it may be difficult in women with dense breast tissue. As such, another type of “pet”-scan, using dogs as a biological assay, might prove feasible for screening if supported by further research. Current tests are also expensive, so the use of dogs for preliminary cancer testing could prove to be an affordable alternative for countries in the developing world.

Retrieved from “https://en.wikinews.org/w/index.php?title=Study_says_dogs_can_smell_lung_and_breast_cancer&oldid=798579”

Blue Security anti-spam community target of large-scale spam attack

Tuesday, May 2, 2006

Beginning Monday morning, many BlueFrog and Blue Security users began receiving an email warning them that if they did not remove their email addresses from the Blue Security registry, they would begin to receive huge amounts of unsolicited email. As quickly as four hours after the initial warning message, some users began to receive an unprecedented amount of spam. Most of the messages were simply useless text. Users reported that Blue Security’s website was unavailable or extremely slow in responding.

Blue Security is an online community dedicated to fighting spam. As the service became more popular, its member list grew substantially. Each member’s email address is run through a one-way (cryptographic) hash function and added to a list of addresses that wish to stop receiving spam; Blue Security maintains this hashed list as its Do Not Intrude Registry. Spammers are encouraged to remove all addresses from their mailing lists that are also in the registry, using free compliance tools available at Blue Security’s web site.
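The article does not describe the compliance tools in detail, but the general idea of scrubbing a mailing list against a registry that is published only as hashes can be sketched as follows. The hash choice (SHA-256), the address normalization, and the function names are illustrative assumptions, not Blue Security’s actual tool.

```typescript
// Minimal sketch of scrubbing a mailing list against a hashed do-not-intrude
// registry. SHA-256 stands in for whatever hash the real registry used.
import { createHash } from "crypto";

function hashAddress(email: string): string {
  return createHash("sha256").update(email.trim().toLowerCase()).digest("hex");
}

// The registry is distributed only as hashes, so the list owner never sees
// the plaintext addresses of the people asking to be removed.
function scrubList(mailingList: string[], registryHashes: Set<string>): string[] {
  return mailingList.filter((email) => !registryHashes.has(hashAddress(email)));
}

// Usage with made-up data:
const registry = new Set(["alice@example.org"].map(hashAddress));
console.log(scrubList(["alice@example.org", "bob@example.net"], registry));
// -> ["bob@example.net"]
```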

According to Blue Security’s web site, “A major spammer had started spamming our members with discouraging messages in an attempt to demoralize our community. This spammer is using mailing lists he already owns that may contain addresses of some community members.” Reportedly, Blue Security has received complaints from users about spam allegedly sent from Blue Security promoting their anti-spam solution and web site.

Blue Security states it is “an anti-spam company determined to fight spam and as such never has and never will send unsolicited email.” There are also reports of non-users of Blue Security/BlueFrog receiving the warning emails, which now appear to be reaching email addresses of people who have never added themselves to Blue Security’s Do Not Intrude Registry.

Retrieved from “https://en.wikinews.org/w/index.php?title=Blue_Security_anti-spam_community_target_of_large-scale_spam_attack&oldid=565439”

Atari Melbourne Project revealed

Thursday, June 9, 2005

Melbourne, Australia — Australia’s oldest video game development studio is working on a PlayStation®2 port of Eden Games’ Test Drive: Unlimited, the latest in the Test Drive series.

Atari Melbourne House has been at the centre of recent speculation after head of studio Andrew Carter took charge earlier this year, and details about their current project have been limited.

A source inside the studio has confirmed that the title in development is a PS2 port of Eden’s next-generation Xbox 360 game.

Although Melbourne House established a strong reputation for driving games with the acclaimed Test Drive: Le Mans (Sega Dreamcast), recent ventures away from the genre such as Men In Black II: Alien Escape and Transformers: Armada (targeted at the high end of the market) have failed to make an impact.

However, speculation about the studio’s long term future took another turn this week when Atari CEO James Caparro resigned. Although the ambitious natural disaster themed The Big One was cancelled when former CEO Bruno Bonnell took a back seat in November last year, the project – known inside the studio as Bonnell’s “pet” – might be revived now Bonnell has been appointed interim CEO in the wake of Caparro’s resignation.

“Who knows, it might be [The Big One], it might be another port,” the source said.

Two senior game designers have resigned from the studio in the past month.

Retrieved from “https://en.wikinews.org/w/index.php?title=Atari_Melbourne_Project_revealed&oldid=433998”

IMF and EU approve aid for Georgia

Tuesday, September 16, 2008

The International Monetary Fund and the European Union approved aid packages to help Georgia recover from its conflict with Russia, which occurred in early August. The IMF approved a US$750 million loan which will allow Georgia to rebuild its currency reserves. The European Union also approved €500 million in aid through 2010, which is expected to help internally displaced people (IDPs) and support economic recovery in the form of new infrastructure. Only €100 million of the EU aid will be given to Georgia this year.

These loans are aimed at restoring confidence in Georgia’s economy and sending a signal to international investors that Georgia’s economy is sound. According to the IMF, international investors have been “critical to Georgia’s economic growth in recent years.”

Takatoshi Kato, Deputy Managing Director and Acting Chairman of the IMF executive committee, said the loan will “make significant resources available to replenish international reserves and bolster investor confidence, with the aim of sustaining private capital inflows that have been critical to Georgia’s economic growth in recent years.”

Georgia has requested $2 billion in international aid to help it recover from the conflict. So far, the United States has pledged $1 billion in aid. Further assistance and loans to Georgia are expected from other organizations. Kato noted that “…Georgia is expected to receive financial assistance from multilateral and bilateral donors and creditors in support of the reconstruction effort.” It is expected that an international donors’ conference will take place next month to solicit more aid for the country.

Georgia’s government expects that economic growth will be more than cut in half as a result of the conflict. Last year, Georgia’s GDP grew 12.4%, and the IMF predicts growth of less than 4 percent in the coming year.

Retrieved from “https://en.wikinews.org/w/index.php?title=IMF_and_EU_approve_aid_for_Georgia&oldid=3031841”

Microsoft to buy antivirus firm Sybari Software

Tuesday, February 8, 2005

Microsoft has announced that it plans to purchase antivirus firm Sybari Software in order to expand its presence in the computer security market. The takeover comes after Microsoft purchased Romanian security firm GeCad in 2003 and two months after an announcement that it is to acquire anti-spyware vendor Giant-Company Software.

Sybari is a private company based in East Newport, New York that develops antivirus software which can be used with Microsoft’s Exchange software, as well as with the Lotus Notes system.

This takeover is seen by analysts as a hostile move on Microsoft’s part, as it places the software giant in direct competition with market leaders Symantec and McAfee. Sterling Auty, an analyst with JP Morgan, told Reuters that he estimated Microsoft could be competing for up to five percent of Symantec’s business and up to eight percent of McAfee’s revenue.

Microsoft’s shares rose $0.10 at midday to $26.26, while both McAfee and Symantec shares fell.

Retrieved from “https://en.wikinews.org/w/index.php?title=Microsoft_to_buy_antivirus_firm_Sybari_Software&oldid=438364”

Wikinews interviews Rocky De La Fuente, U.S. Democratic Party presidential candidate

Thursday, March 31, 2016

Businessman Rocky De La Fuente took some time to speak with Wikinews about his campaign for the U.S. Democratic Party’s 2016 presidential nomination.

The 61-year-old De La Fuente resides in San Diego, California, grew up in Tijuana, and owns multiple businesses and properties throughout the world. Since getting his start in the automobile industry, De La Fuente has branched out into the banking and real estate markets. Despite not having held or sought political office previously, he has been involved in politics, serving as the first-ever Hispanic superdelegate to the 1992 Democratic National Convention.

De La Fuente entered the 2016 presidential race last October largely due to his dissatisfaction with Republican front-runner Donald Trump. He argues he is a more accomplished businessman than Trump, and attacks Trump as “a clown,” “a joke,” “dangerous,” and “in the same category as Hitler.” Nevertheless, De La Fuente’s business background begets comparisons with Trump. The Alaskan Midnight Sun blog described him as the Democrats’ “own Donald Trump.”

While receiving only minimal media coverage, he has campaigned actively and, according to the latest Federal Election Commission filing, has loaned almost US$4 million of his own money to the campaign. He has qualified for 48 primary and caucus ballots, but has not yet obtained any delegates to the 2016 Democratic National Convention. Thus far, according to the count at The Green Papers, De La Fuente has received 35,406 votes, or 0.23% of the total votes cast. He leads among the many lesser-known candidates but trails both Senator Bernie Sanders, who has received nearly 6.5 million votes, and front-runner Hillary Clinton, who has just shy of 9 million votes.

With Wikinews reporter William S. Saturn, De La Fuente discusses his personal background, his positions on political issues, his current campaign for president, and his political future.

Retrieved from “https://en.wikinews.org/w/index.php?title=Wikinews_interviews_Rocky_De_La_Fuente,_U.S._Democratic_Party_presidential_candidate&oldid=4585942”

WWE wrestler John Cena undergoes neck surgery after injury

Thursday, August 28, 2008

The Stamford, Connecticut-based company World Wrestling Entertainment issued a press release last night stating that wrestling performer John Cena is expected to recover from his injuries after surgery.

Dr. Joseph Maroon, head of the WWE Medical Program and described in the press release as a “renowned neurosurgeon”, performed the operation on Cena, who sustained his injuries during a SummerSlam match on Sunday, August 24. He suffered a herniated disc in his neck in a losing match against the wrestler Batista.

Just hours after surgery, he visited the locker room of the SmackDown/ECW taping, commenting to the WWE website that: “I feel great. Dr. Maroon is fantastic. He explained every possible procedure he could and could not perform, and the potential risks of all of them. I explained to Dr. Maroon not only my immediate goals, but also my long-term goals. He took them all into consideration and recommended the most commonplace procedure with the least amount of side effects.”

While the option of fusion surgery was chosen previously by other WWE wrestlers (Edge, Steve Austin, Chris Benoit), Maroon removed a fragment from Cena’s spinal cord, which had been blocking a nerve.

The operation lasted 90 minutes; recovery is expected to take three months, instead of a previously suggested 12-14 months.

In his WWE career, Cena is a three-time WWE Champion, a three-time United States Champion, and a two-time World Tag Team Champion. He also won the 2008 Royal Rumble. Before being promoted to the main WWE roster, Cena trained in and wrestled for Ultimate Pro Wrestling and Ohio Valley Wrestling, holding the top titles of both promotions.

Retrieved from “https://en.wikinews.org/w/index.php?title=WWE_wrestler_John_Cena_undergoes_neck_surgery_after_injury&oldid=4598208”

Mysterious power failure takes down Wikipedia, Wikinews

This article mentions the Wikimedia Foundation, one of its projects, or people related to it. Wikinews is a project of the Wikimedia Foundation.

Wednesday, February 23, 2005

On Monday, at 10:14 p.m. UTC, the Wikimedia server cluster experienced a total power failure, taking down Wikipedia, Wikinews, and all other Wikimedia websites.

The servers are housed in a colocation facility (colo) in Tampa, Florida, USA. They occupy two racks, with each rack receiving electricity from two independent supplies. However, both supplies have circuit-breakers in them, and both opened at the same time, leading to a total power failure. All computers immediately went down. It is normal for fire safety regulations to prohibit uninterruptible power supplies (UPSes) inside colos, with the colo providing its own UPS and generator instead. The circuit breakers were on the computer side of this emergency power system, so none of the computers continued to receive power through the breaker trip, and none had a chance to shut down safely.

The actual reason why the circuit breakers tripped is currently unknown.

When power was restored, it was discovered that most of the MySQL databases that store the data making up Wikipedia et al. had been corrupted. The main database and the four slaves had all suffered damage to the data on their hard disk drives beyond the ability of automatic correction to repair. Only one copy survived safely, on a machine used for report generation and maintenance tasks, which was 31 hours behind while catching up after an unusually heavy update load during the previous week.

Volunteer Wikimedia engineers worked through the night rebuilding the databases from the sole good copy onto the other servers. The Wikipedia database is over 180 GB in size, so the copying process took 1.5 to 2 hours for every server it was performed on.

Regular back-ups of the database of Wikipedia projects are maintained – the encyclopedia in its entirety was not at risk. The last database download was made on February 9; all edits since then could only have been laboriously rebuilt from logs and recovered from the damaged databases, requiring much more time and effort.

Limited read-only service was established late Tuesday afternoon, with editing becoming possible 24 hours after the power failure. Final repairs continue now, as well as upgrades to prevent similar issues in the future. Server-intensive features, such as categories and ‘watchlists’ that display recent changes to selected articles to registered users, remain disabled to ease the load on the recovering systems.

The process which led to the damage originated with the operating system, disk controllers, or hard drives failing to flush the data correctly.

If the power to a database server is cut mid-write, the database may be corrupted and unreadable; however, the operating system, hardware, and software are designed to make this very unlikely. In a previous incident in 2004, power was also lost to a server, but the database was undamaged.

To avoid such damage, each database server saves a copy of an edit to be applied to the database on a separate storage system before making the actual update to the database itself. This so-called ‘write-ahead logging’ should ensure that in the event of a system crash, the database can be rebuilt from a ‘last-good’ state by replaying the edits saved in the log.
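As a rough illustration of the principle only (a toy key-value store, not MySQL’s actual log format), write-ahead logging looks something like this:

```typescript
// Toy write-ahead log: every edit is appended and flushed to a log file
// before the in-memory "database" is touched, so a crash can be recovered
// by replaying the log. This is a sketch, not MySQL's real mechanism.
import { appendFileSync, existsSync, readFileSync } from "fs";

type Database = Map<string, string>;

function applyEdit(db: Database, logPath: string, key: string, value: string): void {
  // 1. Durably record the intended edit first.
  appendFileSync(logPath, JSON.stringify({ key, value }) + "\n");
  // 2. Only then update the database itself.
  db.set(key, value);
}

function recover(logPath: string): Database {
  // Rebuild the last-good state by replaying every logged edit in order.
  const db: Database = new Map();
  if (!existsSync(logPath)) return db;
  for (const line of readFileSync(logPath, "utf8").split("\n")) {
    if (!line.trim()) continue;
    const { key, value } = JSON.parse(line);
    db.set(key, value);
  }
  return db;
}

// Usage: an edit survives a crash because the log entry was written before
// the database update.
const db = recover("edits.log");
applyEdit(db, "edits.log", "article:Stardust", "revision 42");
```

The point of the ordering is that the log, not the live database, is the record of truth; if the machine dies between the two steps, replaying the log reproduces the missing update.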

Earlier this year popular blogging site LiveJournal suffered a similar power failure when another customer at their colocation facility pressed an Emergency Power Off button, intended for use only by firefighters. The company suffered database corruption similar to that seen at Wikimedia.

LiveJournal is now fitting UPS units to its servers to ensure that they have time to shut down safely in the event of a power failure. Wikimedia was said to be investigating the possibility of fitting similar equipment at the time of this failure.

Several pundits have suggested that the use of another database, such as the proprietary Oracle database or the free PostgreSQL, would have avoided the database corruption seen at the server cluster. A post-mortem of the incident shows the failure was in the operating system, the hardware, or some combination of the two. LiveJournal, which also uses MySQL, reported similar database corruption after their power cut.

The Wikimedia Foundation only allows the use of free software on its systems, and future versions of the MediaWiki software will support the PostgreSQL database.

Users are reminded that during times of system failure or excessive demand, they can still search Wikipedia using Google. The articles may be viewed using Google’s cache.

Retrieved from “https://en.wikinews.org/w/index.php?title=Mysterious_power_failure_takes_down_Wikipedia,_Wikinews&oldid=4550382”