
Self-Perceived Quality of Life based on operation success chance


I heard about this method in a psychology lecture once: suppose a person has some sort of chronic illness and is offered an operation to cure it. The patient is then asked what the minimum chance of success the operation would have to have for them to agree to undergo it. The idea is that a patient who would only agree to the operation if it had, say, a 99% chance of success presumably has a higher quality of life while living with his/her illness than a patient who would accept an operation with only a 30% chance of success.
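For what it's worth, one way to formalize that intuition (my own framing, not something stated in the lecture) is in expected-utility terms: set the utility of a successful cure to 1 and the utility of the worst outcome of a failed operation to 0, and let p* be the lowest success probability the patient would still accept. At the point of indifference,

    u(\text{life with illness}) = p^{*} \cdot 1 + (1 - p^{*}) \cdot 0 = p^{*}

so a patient whose threshold is 0.99 is implicitly rating life with the illness at about 0.99, while a threshold of 0.30 implies a rating near 0.30.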

What is this scale of happiness of ill patients called?




‘Fear of not delivering’

Climbing to the top of the peloton’s hierarchy is difficult. The fear of falling off that pedestal can be crushing.

Tyler Farrar lived that emotional rollercoaster during his 13-year pro career. After turning professional at 19, he enjoyed the best-ever run by an American sprinter from 2009-2011, winning stages in all three grand tours, and other major races. Then the wins stopped coming. Farrar spent years wondering why he was unable to challenge André Greipel and Mark Cavendish, as he had done before.

“It was hard, but as a pro, you have to be able to make an honest assessment,” Farrar says. “The problem was that I wasn’t getting any slower. My power was the same. Times had changed. The races had changed. Today, there are three climbs in the last 50km. I like flat roads.”

Desperate to chase results, Farrar began taking more risks and crashing more often. It was a high-wire act, and Farrar experienced first-hand the emotional, physical, and mental strain that every pro endures. After a few subpar seasons, he embraced a chance to become a team captain and domestique at the expanding Dimension Data team. Rather than race to win, he raced to help his teammates win.

Farrar was lucky that he could find a new role and extend his career. Riders are constantly squeezed and pressed by the media and sponsors to win, no matter the cost. Often, there’s no support structure for those riders when they lose.

“You’re around people a lot, but it can also be very isolating,” Farrar says. “It’s every man for himself in a lot of ways.”

There is a thin line between injury — or worse — and glory. Straddling that line requires a special mental toolbox. Measuring risk, overcoming apprehension, managing the anxiety that comes with it, and eventually embracing the fear is as essential to becoming a professional cyclist as one’s VO2max.

Facing such pressures, it’s no surprise that professional cyclists often adopt neurotic behavior. Many are prone to extreme emotions. Since cycling is such a mentally taxing profession, it’s surprising how little attention has been paid to that aspect of performance. Many agree more could be done.

“Some riders fear success,” Keim says. “They’re the underdog, and it becomes their identity. Then they do well, and they have to keep it up. They become their own worst enemy. Others have the unique ability to really step up when the pressure is on to perform. Every athlete has their own unique qualities and challenges.”

Professional racing can be a nomadic experience, with pros living out of suitcases and in hotels for months on end. They’re often isolated, living in a cocooned existence that is unlike any other athletic endeavor. Race, eat, sleep, recover, and then repeat. That’s their day-to-day mantra. Inside that seemingly benign routine, cyclists face real or imagined demons. There’s often a lack of introspection. Results count; whining doesn’t.

During the off-season, Farrar would retreat to the mountains of Washington to get away from the chaos of racing. Hunting and exploring were his salve. After retiring from his racing career, despite having one year left on his contract, Farrar began preparing to be a firefighter. He wanted to give “something back” to his community after dedicating most of his life to the self-serving pursuit of racing.

“I didn’t want to be the guy who stuck around too long, and just hung on for dear life in his career,” Farrar says. “Cycling was always my dream, but the one thing I struggled with in being a pro athlete is that it’s a pretty selfish existence.”

Farrar is something of an oddity in the peloton: a pro who retired on his terms with sound body and mind.

Tyler Farrar. Photo: Tim De Waele | Getty Images


Environmental impact:

Due to the absence of a production process, the environmental impact is minimal. The business is an application-based operation and therefore does not directly affect the environment. Moreover, by reducing scattered parking and the traffic congestion it causes, the business indirectly helps to minimize environmental hazards.

Community impact:

Local residents and shopkeepers will benefit from the online parking allotment. Shopping and residential premises will be kept free of stray parking, which will ease the development and upgrading of the area.

The anticipated risk factors associated with the business are:

  • Unavailability of start-up funds
  • Loss of assets
  • Unavailability of ample number of parking slots
  • High rental costs to tie up with the parking owners

Strategies such as proper handling of assets and effective promotion of awareness of online parking will be the primary means of avoiding these uncertainties and their consequences.


Top 30 Innovations Of The Last 30 Years

Imagine this is 1979: If you were reading this article back then, chances are you would have read it on paper--with a printed newspaper or magazine in your hands. Today, you are probably reading it on a desktop computer, a laptop (or as a printout from either of these) or perhaps even on your BlackBerry or iPhone. The pace of innovation has been so hectic in recent years that it is hard to imagine which innovations have had the greatest impact on business and society.

Is it possible to determine which 30 innovations have changed life most dramatically during the past 30 years? That is the question that Nightly Business Report, the Emmy Award-winning PBS business program, and Knowledge@Wharton set out to answer to celebrate NBR's 30th anniversary this year. NBR partnered with Knowledge@Wharton to create a list of the "Top 30 Innovations of the Last 30 Years." The show's audiences from more than 250 markets across the country and Knowledge@Wharton's readers from around the world were asked to suggest innovations they think have shaped the world in the last three decades.

After receiving some 1,200 suggestions--everything from lithium-ion batteries, LCD screens and eBay to the mute button, GPS and suitcase wheels--a panel of eight judges from Wharton reviewed and selected the top 30 innovations, which were revealed on air and online Feb. 16.

The list is as follows, in order of importance:

1. Internet, broadband, www (browser and html)

5. DNA testing and sequencing/human genome mapping

6. Magnetic Resonance Imaging (MRI)

9. Office software (spreadsheets, word processors)

10. Non-invasive laser/robotic surgery (laparoscopy)

11. Open-source software and services (e.g., Linux, Wikipedia)

13. Liquid crystal display (LCD)

15. Online shopping/e-commerce/auctions (e.g., eBay)

16. Media file compression (jpeg, mpeg, mp3)

18. Photovoltaic solar energy

19. Large-scale wind turbines

20. Social networking via the Internet

21. Graphic user interface (GUI)

22. Digital photography/videography

23. RFID and applications (e.g., EZ Pass)

24. Genetically modified plants

26. Bar codes and scanners

30. Anti-retroviral treatment for AIDS

Before the winners could be selected from the vast number of entries, the Wharton judges first had to define what innovation means in an age dominated by digital technology, medical advancements and mobile communications. The judges included Ian MacMillan, director of the Sol C. Snider Entrepreneurial Research Center; Thomas Colligan, vice dean, Wharton Executive Education; Kevin Werbach, professor of legal studies and business ethics; Karl Ulrich, chair, operations and information management department; Franklin Allen, co-director of the Wharton Financial Institutions Center; George Day, co-director of the Mack Center for Technological Innovation; Lori Rosenkopf, professor of management; and Mukul Pandya, editor in chief of Knowledge@Wharton.

"Innovation is a surprisingly hard word to define," says Werbach. "Everyone thinks they know it, but when you ask them to explain exactly what an innovation is, it gets very hard." In order to achieve the best results and narrow down the most authentic list of winners, Werbach and his fellow judges defined innovation as more than simply a new invention. "It's something new that creates new opportunities for growth and development," he says, citing cellular technology, which ranks No. 3 on the list. "We've gone from zero to close to three-and-a-half-billion people who have a mobile device and are connected to each other."

Another qualification the judges used to highlight the most sophisticated, powerful innovations was problem-solving value, says Ulrich. "Almost all product design is, in fact, innovation, but the converse is not true," he adds. "Many successful innovations begin with a user need. Some innovations occur because of some serendipitous event or some scientific discovery. The innovator goes and looks for the user and looks for an application of the technology."

An example in the pharmaceutical industry is the development of new chemical compounds to treat medical conditions, as seen in No. 30 on the list, anti-retroviral treatments for HIV and AIDS. "We don't think of that as a product design," says Ulrich, "but we would think of it as an innovation."

Hardly a surprise, the Internet--combined with broadband, browsers and HTML--was ranked first in a list dominated by technological and medical advancements. MacMillan notes that the Internet is an innovation that created an industry and subsequent new technologies, making it an especially important category. "Some [innovations] are more transient and come and go very quickly," he says. "To me, the ones that really matter are the ones that generate whole new industries."

Colligan credits the technology with improving communications and enhancing the standard of living and working, regardless of one's location. "Technology has leveled the playing field," he says, adding that he is not surprised so many innovations fall under the technology category. "It's brought populations that were in poverty, frankly, up to certainly a better standard of living. It's allowed others to enter the workforce in the new global environment."

The panel of judges applied a specific set of criteria to narrow down the innovations in evolving technological and scientific fields. The innovations were selected based on how they impact quality of life, fulfill a compelling need, solve a problem, exhibit a "wow" factor, change the way business is conducted, increase efficiency, spark new innovations and create a new industry.

Day says the Internet ranked high, along with mobile computing and telecommunications devices, because of the way this collective of innovations connects people, saves time and creates mobile access points for knowledge. "The Internet took away a major constraint to accessing knowledge and sharing knowledge," he says. "But a bigger innovation is one that spawns other innovations."

Almost every aspect of business or social relations today is touched by the Internet and the subsequent industries the platform has created on an international scale. "It's hard to imagine tackling a challenge like bringing clean water and good health care to the largest number of people possible in the developing world without using the Internet and the technologies around it," says Werbach. "It's not just a business phenomenon. It's a central organizing platform for anything you can think of."

Werbach also says laptop computers, ranked No. 2, are related to the Internet, thanks to connectivity in the digital realm. "The computer is not something that is in a specific place (i.e., your office)," he says. "It changes the nature of interaction." And it connects with multiple devices that have been created in the last 30 years, including digital cameras, digital music players and wireless printers.

Innovations in Health Care

Many of the innovations capitalize on existing technology to flourish. In some cases, the results not only demonstrate measured success now among select innovations, but also focus on categories that promise even greater success in the future. Most of the scientific selections--including drug developments, surgical advancements and new diagnostic tools--have the potential to spur greater innovation within the next few years to extend life and cure disease. Within the top 10 alone, DNA testing and sequencing, human genome mapping, Magnetic Resonance Imaging (MRI) and non-invasive laser and robotic surgery (laparoscopy) are included.

"DNA has a huge promise to improve diagnoses," says Day, adding that DNA testing and sequencing ranked at No. 5 because of its ability to enhance the pharmaceutical industry by spawning more effective drugs based on genetic factors that have been impossible to determine without it.

Many innovations on the list also subscribe to a "wow" factor, or characteristics that somehow make the innovation surprising, unusual or unexpected, which becomes more difficult to gauge the longer an innovation is used and the more familiar it becomes. But the wow factor, says Ulrich, is important for two reasons: to grab a user's attention and to erect a barrier between it and the competition.

Colligan says this form of competitive marketing and innovation is very much on the minds of the nation's top executives as a way to enhance business goals in a challenging economy. "Innovation creates new revenue streams," he says. "It's a mindset that needs to be started at the top of the organization to allow people to experiment and try different things. It's the opportunity to break through existing models that not only allow for new innovations, but also challenge executives in organizations that have that type of mindset to attract top talent."

Despite a few of the trends revealed in this listing, innovation is not restricted to consumer products organizations or the health care industry. "[Innovation] happens every day," says Colligan, "when executives are looking for solutions to a problem and consultants and professionals are putting together a team. The challenge that professional service firms have is that when good work is done, how do they replicate that?"

The current economic climate weighs heavily on the importance of these 30 innovations, especially as new technology is being used to preserve, and in some cases, revive the commercial landscape. "The innovations are in stark contrast to what we're going through in the economy right now," says Allen. "The innovations also point toward peoples' expectations about the future in the way they change the world." In this category belong the innovations in energy--such as photovoltaic solar energy, which clocked in at No. 18, and large-scale wind turbines (No. 19).

Allen compares these innovations to important strides in the early to mid-20th century, like antibiotics, aspirin, automobiles and improvements in radio technology. "Some things are really fundamental in the way they change the world," he says. "One would hope these 30 innovations would be just as important 30 years or more from now."


Chapter 2

The opioid epidemic took hold in the U.S. in the 1990s. Percocet, OxyContin and Opana became commonplace wherever chronic pain met a chronic lack of access to quality health care, especially in Appalachia.

The Centers for Disease Control and Prevention calls the prescription opioid epidemic the worst of its kind in U.S. history. “The bottom line is this is one of the very few health problems in this country that’s getting worse,” said Dr. Tom Frieden, director of the CDC.

U.S. Heroin Use By Year

“We had a fourfold increase in deaths from opiates in a decade,” Frieden said. “That’s nearly 17,000 people dying from prescription opiate overdoses every year. And more than 400,000 go to an emergency room for that reason.”

Clinics that dispensed painkillers proliferated with only the loosest of safeguards, until a recent coordinated federal-state crackdown crushed many of the so-called “pill mills.” As the opioid pain meds became scarce, a cheaper opioid began to take over the market — heroin. Frieden said three quarters of heroin users started with pills.

Federal and Kentucky officials told The Huffington Post that they knew the move against prescription drugs would have consequences. “We always were concerned about heroin,” said Kevin Sabet, a former senior drug policy official in the Obama administration. “We were always cognizant of the push-down, pop-up problem. But we weren’t about to let these pill mills flourish in the name of worrying about something that hadn’t happened yet. … When crooks are putting on white coats and handing out pills like candy, how could we expect a responsible administration not to act?”

As heroin use rose, so did overdose deaths. The statistics are overwhelming. In a study released this past fall examining 28 states, the CDC found that heroin deaths doubled between 2010 and 2012. The CDC reported recently that heroin-related overdose deaths jumped 39 percent nationwide between 2012 and 2013, surging to 8,257. In the past decade, Arizona’s heroin deaths rose by more than 90 percent. New York City had 420 heroin overdose deaths in 2013 — the most in a decade. A year ago, Vermont’s governor devoted his entire State of the State speech to heroin’s resurgence. The public began paying attention the following month, when Philip Seymour Hoffman died from an overdose of heroin and other drugs. His death followed that of actor Cory Monteith, who died of an overdose in July 2013 shortly after a 30-day stay at an abstinence-based treatment center.

In Cincinnati, an entry point for heroin heading to Kentucky, the street dealers beckoning from corners call it “dog” or “pup” or “dog food.” Sometimes they advertise their product by barking at you. Ohio recorded 680 heroin overdose deaths in 2012, up 60 percent over the previous year, with one public health advocate telling a local newspaper that Cincinnati and its suburbs suffered a fatal overdose every other day. Just over the Ohio River the picture is just as bleak. Between 2011 and 2012, heroin deaths increased by 550 percent in Kentucky and have continued to climb steadily. This past December alone, five emergency rooms in Northern Kentucky saved 123 heroin-overdose patients; those ERs saw at least 745 such cases in 2014, 200 more than the previous year.

For addicts, cravings override all normal rules of behavior. In interviews throughout Northern Kentucky, addicts and their families described the insanity that takes hold. Some addicts shared stories of shooting up behind the wheel while driving down Interstate 75 out of Cincinnati, or pulling over at an early exit, a Kroger parking lot. A mother lamented her stolen heirloom jewelry and the dismantling of the family cabin piece by piece until every inch had been sold off. Addicts stripped so many houses, barns, and churches of copper and fixtures in one Kentucky county that the sheriff formed a task force. Another overdosed on the couch, and his parents thought maybe they should just let him go.

Northern Kentucky Hit Hard By Heroin Overdoses

Between 2011 and 2014, heroin overdoses at five Kentucky emergency rooms outside of Cincinnati — Covington, Ft. Thomas, Edgewood, Florence and Grant County — increased by 669 percent.

Chemistry, not moral failing, accounts for the brain’s unwinding. In the laboratories that study drug addiction, researchers have found that the brain becomes conditioned by the repeated dopamine rush caused by heroin. “The brain is not designed to handle it,” said Dr. Ruben Baler, a scientist with the National Institute on Drug Abuse. “It’s an engineering problem.”

Dr. Mary Jeanne Kreek has been studying the brains of people with addiction for 50 years. In the 1960s, she was one of three scientists who determined that methadone could be a successful maintenance treatment for an opioid addicted person. Over the years, various drug czars from both political parties have consulted her at Rockefeller University in New York City, where she is a professor and head of the Laboratory of the Biology of Addictive Diseases. According to Kreek, there’s no controversy over how opiate addiction acts upon the brain.

“It alters multiple regions in the brain,” Kreek said, “including those that regulate reward, memory and learning, stress responsivity, and hormonal response, as well as executive function which is involved in decision-making — simply put, when to say yes and when to say no.”

A heroin addict entering a rehab facility presents as severe a case as a would-be suicide entering a psych ward. The addiction involves genetic predisposition, corrupted brain chemistry, entrenched environmental factors and any number of potential mental-health disorders — it requires urgent medical intervention. According to the medical establishment, medication coupled with counseling is the most effective form of treatment for opioid addiction. Standard treatment in the United States, however, emphasizes willpower over chemistry.

To enter the drug treatment system, such as it is, requires a leap of faith. The system operates largely unmoved by the findings of medical science. Peer-reviewed data and evidence-based practices do not govern how rehabilitation facilities work. There are very few reassuring medical degrees adorning their walls. Opiates, cocaine and alcohol each affect the brain in different ways, yet drug treatment facilities generally do not distinguish between the addictions. In their one-size-fits-all approach, heroin addicts are treated like any other addicts. And with roughly 90 percent of facilities grounded in the principle of abstinence, that means heroin addicts are systematically denied access to Suboxone and other synthetic opioids.

On average, private residential treatment costs roughly $31,500 for 30 days. Addicts experience a hodgepodge of drill-instructor tough love, self-help lectures, and dull nights in front of a television. Rules intended to instill discipline govern all aspects of their lives, down to when they can see their loved ones and how their bed must be made every morning. A program can seem both excessively rigid and wildly disorganized.

After a few weeks in a program, opiate addicts may glow as if born again and testify to a newfound clarity. But those feelings of power and self-esteem can be tethered to the rehabilitation facility. Confidence often dims soon after graduation, when they once again face real life with a still-warped brain hypersensitive to triggers that will push them to use again. Cues such as a certain smell associated with the drug or hearing the war stories of other addicts could prompt a relapse.

“The brain changes, and it doesn’t recover when you just stop the drug because the brain has been actually changed,” Kreek explained. “The brain may get OK with time in some persons. But it’s hard to find a person who has completely normal brain function after a long cycle of opiate addiction, not without specific medication treatment.”

An abstinence-only treatment that may have a higher success rate for alcoholics simply fails opiate addicts. “It’s time for everyone to wake up and accept that abstinence-based treatment only works in under 10 percent of opiate addicts,” Kreek said. “All proper prospective studies have shown that more than 90 percent of opiate addicts in abstinence-based treatment return to opiate abuse within one year.” In her ideal world, doctors would consult with patients and monitor progress to determine whether Suboxone, methadone or some other medical approach stood the best chance of success.

A 2012 study conducted by the National Center on Addiction and Substance Abuse at Columbia University concluded that the U.S. treatment system is in need of a “significant overhaul” and questioned whether the country’s “low levels of care that addiction patients usually do receive constitutes a form of medical malpractice.”

While medical schools in the U.S. mostly ignore addictive diseases, the majority of front-line treatment workers, the study found, are low-skilled and poorly trained, incapable of providing the bare minimum of medical care. These same workers also tend to be opposed to overhauling the system. As the study pointed out, they remain loyal to “intervention techniques that employ confrontation and coercion — techniques that contradict evidence-based practice.” Those with “a strong 12-step orientation” tended to hold research-supported approaches in low regard.

Researchers have been making breakthroughs in addiction medicine for decades. But attempts to integrate science into treatment policy have been repeatedly stymied by scaremongering politics. In the early 1970s, the Nixon administration promoted methadone maintenance to head off what was seen as a brewing public health crisis. Due to fears of methadone’s misuse, however, regulations limited its distribution to specialized clinics, and it became a niche treatment. Methadone clinics have since become the targets of NIMBYs and politicians who view them as nothing more than nuisance properties. In the late ’90s, then-New York City Mayor Rudy Giuliani tried unsuccessfully to cut methadone programs serving 2,000 addicts on the grounds that despite the medication’s success as a treatment, it was an immoral solution and had failed to get the addicts employed.

A new medication developed in the 1970s, buprenorphine, was viewed as a safer alternative to methadone because it had a lower overdose risk. “Bupe,” as it’s become known, was originally approved for pain relief, but knowledgeable addicts began using it as a black market route to drug rehabilitation. Government approval had to catch up to what these addicts had already field tested. After buprenorphine became an accepted treatment in France in the mid-’90s, other countries began to treat heroin addicts with the medication. Where buprenorphine has been adopted as part of public policy, it has dramatically lowered overdose death rates and improved heroin addicts’ chances of staying clean.

In 2002, the U.S. Food and Drug Administration approved both buprenorphine (Subutex) and buprenorphine-naloxone (Suboxone) for the treatment of opiate dependence. Suboxone combines bupe with naloxone, the drug that paramedics use to revive overdose victims. These medications are what’s called partial agonists, which means they have a ceiling on how much effect they can deliver, so extra doses will not make the addict feel any different.

Whereas generic buprenorphine can produce a high if injected, Suboxone was formulated to be more difficult to manipulate. If an addict uses it improperly by injecting it, the naloxone kicks in and can send the person into withdrawal — the opposite of a good time.

In the U.S., the more abuse-resistant Suboxone dominates the market, making it the most widely prescribed of the medically assisted treatments for opioid addiction.

Neither Suboxone nor methadone is a miracle cure. They buy addicts time to fix their lives, seek out counseling and allow their brains to heal. Doctors recommend tapering off the medication only with the greatest of caution. The process can take years given that addiction is a chronic disease and effective therapy can be a long, grueling affair. Doctors and researchers often compare addiction from a medical perspective to diabetes. The medication that addicts are prescribed is comparable to the insulin a diabetic needs to live.

“If somebody has a heroin dependence and they did not have the possibility to be offered methadone or Suboxone, then I think it’s a fairly tall order to try and get any success,” said Dr. Bankole Johnson, professor and chair of the Department of Psychiatry at the University of Maryland School of Medicine. “There have been so many papers on this — the impact of methadone and Suboxone. It’s not even controversial. It’s just a fact that this is the best way to wean people off an opioid addiction. It’s the standard of care.”

“There have been so many papers on this — the impact of methadone and Suboxone. It’s not even controversial.”

But as the National Center on Addiction and Substance Abuse study pointed out, treatment as a whole hasn’t changed significantly. Dr. A. Thomas McLellan, the co-founder of the Treatment Research Institute, echoed that point. “Here’s the problem,” he said. Treatment methods were determined “before anybody really understood the science of addiction. We started off with the wrong model.”

For families, the result can be frustrating and an expensive failure. McLellan, who served as deputy director of the White House’s Office of National Drug Control Policy from 2009 to 2011, recalled recently talking to a despairing parent with an opiate-addicted son. The son had been through five residential treatment stays, costing the family more than $150,000. When McLellan mentioned buprenorphine, the father said he had never heard of it.

Most treatment programs haven’t accepted medically assisted treatments such as Suboxone because of “myths and misinformation,” said Robert Lubran, the director of the pharmacological therapy division at the federal Substance Abuse and Mental Health Services Administration.

In fiscal year 2014, SAMHSA, which helps to fund drug treatment throughout the country, had a budget of roughly $3.4 billion dedicated to a broad range of behavioral health treatment services, programs and grants. Lubran said he didn’t believe any of that money went to programs specifically aimed at treating opioid-use disorders with Suboxone and methadone. It’s up to the states to use block grants as they see fit, he said.

Kentucky has approached Suboxone in such a shuffling and half-hearted way that just 62 or so opiate addicts treated in 2013 in all of the state’s taxpayer-funded facilities were able to obtain the medication that doctors say is the surest way to save their lives. Last year that number fell to 38, as overdose deaths continued to soar.

Waiver Required

Federal waivers are required for doctors to prescribe buprenorphine products like Suboxone. Kentucky has 518 doctors with such waivers, most clustered around cities like Louisville and Lexington.

In multiple states struggling to manage the epidemic, thousands of addicts have no access to Suboxone. There have been reports by doctors and clinics of waiting lists for the medication in Kentucky, Ohio, central New York and Vermont, among others. In one Ohio county, a clinic’s waiting list ran to more than 500 patients. Few doctors choose to get certified to dispense the medication, and those who do work under rigid federal caps on how many patients they can treat. Some opt not to treat addicts at all. According to state data, more than 470 doctors are certified in Kentucky, but just 18 percent of them fill out 80 percent of all Suboxone prescriptions.

There’s no single explanation for why addiction treatment is mired in a kind of scientific dark age, why addicts are denied the help that modern medicine can offer. Family doctors tend to see addicts as a nuisance or a liability and don’t want them crowding their waiting rooms. In American culture, self-help runs deep. Heroin addiction isn’t only a disease – it’s a crime. Addicts are lucky to get what they get.


11 Answers

I think the golden rule here is "Everything in moderation".

If you are fairly certain a piece of code is going to prove to be a bottleneck, it's not a horrible practice to do some initial optimization. At the very least, it's a good idea to take steps to make sure it will be easy to refactor later.

What you want to avoid is going overboard by sacrificing time and readability in the name of micro-optimizations before you've seen actual data to justify such an effort.

The line between "optimizing" and just "sensible design" is sometimes fairly fine, but other times pretty obvious. Just for example, you don't need a profiler to be pretty sure that if you're sorting a few million items, it's worth using an O(N log N) algorithm rather than an O(N^2) algorithm. IMO, that just falls under being reasonably sensible, not optimization, though.
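As a rough C++ sketch of that example (the sizes are made up; the point is that the choice is predictable without a profiler):

    #include <algorithm>
    #include <random>
    #include <vector>

    int main() {
        std::vector<int> data(2'000'000);
        std::mt19937 rng(42);
        for (int& x : data) x = static_cast<int>(rng());

        // Sensible design: the library sort is O(N log N). No profiler is needed
        // to prefer it over a hand-rolled O(N^2) selection or bubble sort, which
        // on a few million items would perform on the order of 10^12 comparisons.
        std::sort(data.begin(), data.end());
        return 0;
    }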

There are also some things you might as well do, simply because they might provide a benefit, and the cost is minimal to nonexistent. To use @unholysampler's example, writing ++i instead of i++ may have some minuscule cost if you're accustomed to typing it as a post-increment, but (at most) it's temporary and trivial. I wouldn't spend any time rewriting working code for the sake of possibly saving a nanosecond, unless the profiler had shown that I really needed that time, and stood a reasonable chance of saving there. At the same time, when I'm just typing in new code, I'd work at habitually using the form that's likely to be better, because once you do so habitually it's free. It frequently won't gain anything, and even when it makes a difference it often won't be large enough to notice or care about -- but it's still free, so there's no reason not to do it.
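A minimal illustration of that ++i habit (C++ sketch; whether the difference is ever measurable depends on the iterator type and the optimizer):

    #include <list>
    #include <string>

    int count_nonempty(const std::list<std::string>& items) {
        int n = 0;
        // Pre-increment never needs to materialize a copy of the old iterator;
        // post-increment formally does, though compilers usually optimize it away.
        // Typing ++it habitually costs nothing once the habit is formed.
        for (auto it = items.begin(); it != items.end(); ++it) {
            if (!it->empty()) ++n;
        }
        return n;
    }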

Cases like those are fairly obvious, and in my mind would really fall under sensible design rather than what I'd think of as truly optimization. Most other cases of "optimization without representation" would be considerably harder to justify though. If you're going to spend any significant time or effort on the optimization, you should have something much more solid than a "gut feel" to justify it.

I should add that part of the reason I say that is that I think profiling code is extremely useful, even when your goal isn't optimization. A profiler gives a high-level overview of a piece of code that can be extremely useful even when you don't particularly care about optimization at all. Just for example, if you see 10x as many calls to allocate a resource as to free that type of resource, it's a pretty good clue that there's a real problem, even if the code currently seems to run just fine. When you get down to it, a lot of code has a lot of things that should match up (not necessarily 1:1, but somehow or other) and a profiler can show mismatches like that much more quickly than most other tools.
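One lightweight way to surface that kind of acquire/release mismatch, even outside a profiler, is to count both sides and compare. This is only an illustrative sketch; the Handle type and counters are hypothetical:

    #include <atomic>
    #include <cstdio>

    // Hypothetical counters illustrating the "things that should match up" idea:
    // every acquire should eventually be paired with a release.
    std::atomic<long> g_acquired{0};
    std::atomic<long> g_released{0};

    struct Handle {
        Handle()  { ++g_acquired; }   // stands in for allocating some resource
        ~Handle() { ++g_released; }   // stands in for freeing it
    };

    void report() {
        // A large, persistent gap between the two numbers is the same red flag a
        // profiler's call counts would show, even if the program "runs fine".
        std::printf("acquired=%ld released=%ld\n",
                    g_acquired.load(), g_released.load());
    }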

The point of profiling before optimizing is that you need a baseline to determine how much improvement the optimization gave you. The "surprising" information I get from my profiler will be along these lines:

  • I knew that method was called a lot, but I didn't realize it was called that much.
  • Thread monitor contention almost never slows things down where I expect it.
  • I'm generating that many instances of XYZ? It should only be about n.

That said, many times the profiler merely confirms my suspicions. Good scientific method involves healthy doses of monitoring and experimentation. It's a bit difficult, but I do try to figure out more systemic problems with the profiler. I could do the obvious change and get a 5% improvement, or if I approached the problem differently I might be able to get a 25% improvement. (Most optimizations don't yield such a large improvement, but on occasion they do.) Of course, I wouldn't know how much my optimization improved the performance without a baseline measurement.
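The baseline itself doesn't need to be fancy. A minimal C++ sketch of a before/after measurement (the workload in do_work is a placeholder):

    #include <chrono>
    #include <cstdio>
    #include <numeric>
    #include <vector>

    // Placeholder workload standing in for the code path being tuned.
    long do_work() {
        std::vector<long> v(1'000'000);
        std::iota(v.begin(), v.end(), 0L);
        return std::accumulate(v.begin(), v.end(), 0L);
    }

    int main() {
        using clock = std::chrono::steady_clock;
        const int runs = 50;
        long sink = 0;                 // keeps the optimizer from dropping the work
        const auto start = clock::now();
        for (int i = 0; i < runs; ++i) sink += do_work();
        const std::chrono::duration<double, std::milli> elapsed = clock::now() - start;

        // Record this number before changing anything; rerunning the same loop after
        // the change is what turns "I think it's faster" into a measured improvement.
        std::printf("mean per run: %.3f ms (sink=%ld)\n", elapsed.count() / runs, sink);
        return 0;
    }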

"I can sense are going to be bottlenecks" (sic)

The problem with this statement is observer error. Just because you think it may be bad doesn't mean it IS bad. The profiler will give empirical evidence and keep you from spending time in an area that may give no improvement. That's why you should start with the profiler before optimizing. Let the program and profiler PROVE that it is a slow section first.

There are known performance killers and known best practices which avoid them. I don't think you need to profile to determine a cursor is slower in SQL Server than a set-based operation. If you start out knowing this, you will write code that performs better from the start without the need to profile the two options every time you write code.

If you are adjusting existing code, it is better to profile: not only can you confirm that the code you believe is inefficient really is performing badly, but profiling can also show up problems you didn't know you had. I remember profiling one thing where I suspected the stored proc could be optimized (it could) and found out that it was being called hundreds of times by the application when it only needed to be called once.

Additionally, you have the benefit of being able to prove that you did in fact improve performance when you profile. I personally save these before and after figures and use them in my performance appraisal write-ups and in discussing my achievements when job hunting. It's nice to have actual figures to show how good you are at tuning.

The reason people say that is because you won't know what needs to be optimized before your profiling is finished, so you could end up wasting a lot of time for no benefits.

Still, if something leaps out at you as a bad idea (e.g. your coworker chooses to use deep recursion instead of a simple iterative loop) you can fix it. But you should be focused on moving forward, not navel-gazing at old code. There is a stage in the process where that is appropriate.

I think everyone tries to write good code from the beginning. That sounds like what you're doing.

What I think should be done in the beginning is to just keep the design simple, especially the data structures. Often people start off assuming they need a more sophisticated data structure, redundant data, and detailed notification techniques because they are worried about performance. In my experience, those things cause the problems they are supposed to avoid.

In spite of good coding practice and good design, performance problems creep in, and you need to remove them periodically. These are almost never things you could have guessed, and most profilers are not very good at finding them either. What's more, the optimization level of the compiler seldom has any effect on them, because mostly they are not tight compute-bound loops. Mostly they present as innocent-looking (or even invisible) function calls that, if you randomly snapshot the stack, are in the middle of it, and are consuming way more wall-clock time than you ever would have imagined, as shown by how often they appear there.

There are two kinds of optimisation that can (relatively) safely be done before profiling.

Algorithmic optimisation: choosing an algorithm with better average (or worst-case) complexity. This can even (should?) be done before beginning coding. You'll still have to check that the selected algorithm is the correct one given your real data set, but it is a good idea to start with an algorithm that is expected to fare better, isn't it?

Data structure optimisation: laying out your data correctly, or using a data structure with better locality, can increase your performance, but it will have an impact on the algorithms that can be used, so it is easier to do such an optimisation before coding (and thus you cannot use a profiler if there is no code). For example, when programming a video game, it is generally better to use struct-of-arrays (SoA) instead of array-of-structs (AoS) to store data, as it will benefit from data locality, cache coherency, and so on.
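A minimal sketch of the two layouts mentioned there, in C++ (the particle fields are invented for illustration):

    #include <cstddef>
    #include <vector>

    // Array-of-structs: each particle's fields sit together in memory, so a pass
    // that only reads x also drags y, z and health through the cache.
    struct ParticleAoS {
        float x, y, z, health;
    };

    // Struct-of-arrays: each field is contiguous, so a loop over x alone streams
    // through memory with no wasted cache-line traffic.
    struct ParticlesSoA {
        std::vector<float> x, y, z, health;
    };

    float sum_x(const ParticlesSoA& p) {
        float s = 0.0f;
        for (std::size_t i = 0; i < p.x.size(); ++i) s += p.x[i];
        return s;
    }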

Yes, it is wrong to optimize before profiling, BUT

Good programming, programming that makes code simpler and more straightforward, doesn't require profiling. Good programming, like moving unneeded initializations out of loops, does not need any more justification than the fact that you are improving the quality of the code.
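A small example of the kind of change meant here, with loop-invariant work hoisted out simply because the code reads better that way (C++ sketch; the pattern and names are made up):

    #include <regex>
    #include <string>
    #include <vector>

    int count_lines_with_numbers(const std::vector<std::string>& lines) {
        // The pattern never changes, so it is compiled once, outside the loop.
        // Rebuilding it on every iteration would be the "unneeded initialization"
        // in question; hoisting it needs no profiler to justify.
        const std::regex number_re("[0-9]+");   // hypothetical pattern
        int n = 0;
        for (const std::string& line : lines) {
            if (std::regex_search(line, number_re)) ++n;
        }
        return n;
    }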

IMHO the only times you need to profile is when you are specifically looking to improve performance. In order to show the improvement you must baseline first, and then show the delta.

Adding complexity under the guise of optimization, without proof that there was a bottleneck and that the change actually improves performance, is just bad programming.

The critical distinction is between:

The optimized code is as simple as, or simpler than, the un-optimized code.

The optimized code is more complex (and therefore more error prone and harder to modify in the future) than the un-optimized code.

In the first case, sure go ahead. In the second case you have to weigh the investment in development time (including opportunity cost on not using the same time to fix bugs or deliver features) and the future higher cost of maintenance for a more complex solution. You have to weigh this cost against the observable improvements to performance. How will you perform this judgement if you have no idea what the performance cost is? A function might be obviously inefficient, but if it only takes a few milliseconds anyway, a 1000x performance optimization will not provide any value. Not to mention the opportunity cost of not working on an optimization where it actually matters.

Second, your intuition about performance might very well be wrong - and you will never know if you "optimize" before measuring. For example, many developers tend to think that, say, an O(log n) algorithm is faster than an O(n) one. But you don't know that. The O(n) algorithm might be faster as long as n is below some threshold. What is that threshold? You probably don't know. And what is n actually in your particular program? Is it usually above or below this threshold? How will you find out?
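To make that concrete, here is a C++ sketch of the two searches; where the crossover sits for your n and your data is exactly the question only a measurement can answer:

    #include <algorithm>
    #include <vector>

    // O(log n): binary search over data that is already sorted.
    bool contains_binary(const std::vector<int>& sorted, int key) {
        return std::binary_search(sorted.begin(), sorted.end(), key);
    }

    // O(n): a plain linear scan. It needs no sorting, is friendly to the cache and
    // branch predictor, and can beat the "better" algorithm while n stays small.
    bool contains_linear(const std::vector<int>& v, int key) {
        return std::find(v.begin(), v.end(), key) != v.end();
    }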

What if your intuition is wrong and your optimization actually makes the program run slower? If you profiled before and after, you realize your mistake and roll the changes back. At worst you wasted some time. If you don't profile, you are none the wiser and have caused long-term damage to the program.

The really difficult decision is when you can go down different roads in the architecture. Should the network communication be JSON over HTTP (simple) or protocol buffers over TCP (performant)? The problem here is you have to decide up front, before you can even measure performance, if you don't want to waste work by having to change the protocol later. In this case you cannot just start out with the simple version and then optimize later when it turns out to be a problem. But you shouldn't just choose the most performant version by default either. You will have to make some educated guesses and projections.

I note you state that profiling "as often as not" gives the same result as you intuition based on your understanding of the program. I take that to mean that you have a 50% success rate in predicting the best way to allocate resources for optimizing. Given the short and long term cost of misapplied optimization, that is not really a very good rate to rely on.


Social Darwinism in Western History and Politics

It is crucial to note the implicit and overt role that Social Darwinism continues to play in the imperialist appetites of Westernized powers, led by the U.S. in its global mission of conquest, cleverly euphemized as conducting a “War on Terrorism”. U.S. involvement in places such as the Middle East has traditionally been based on their lucrative natural resources. In the 1970s, America turned toward the Middle East because of gas shortages and the accordant high prices; it embarked on Operation Desert Storm in the final decade of the century in order to lower gas prices and gain better access to oil.

Social Darwinism can justify both of these occurrences, and has been so implicitly and subtly ingrained in Westernized culture that it is now simply an afterthought to any aggressive military-based operation. The rationale for U.S. involvement in these aforementioned incidents is quite simple—the country views these natural resources as vital to its survival, and therefore believes that it has the military and economic strength to force those in control of such resources in foreign lands to comply with American wishes. When they are slow to do so or less than cooperative in the terms of their compliance, military force ensues, which may result in a lasting occupation, as more recent events in Iraq and Afghanistan seemingly indicate. The relevance of Social Darwinism in these forms of imperialism is easily discerned in the subsequent quotation.

Imperialism

The application of Darwin’s biological concepts to the social world…buttressed imperialism, racism, nationalism, and militarism—doctrines that preached relentless conflict. Social Darwinists insisted that nations and races were engaged in a struggle for survival in which only the fittest survive and deserve to survive (Perry).

Thus, America certainly feels justified in demanding resources and taking military action that is necessary to set up economic and traditional methods of imperialism as simply a means of survival. Its current “War on Terror” provides an excellent example of this fact. The U.S. has occupied and enforced its own form of government in both Afghanistan and Iraq in events that can be traced to the destruction of the World Trade Center. Doing so is simply contemporary imperialism in the traditional sense in which countries appropriated foreign territory as their own. Afghanistan was reportedly linked to the origin of the World Trade Center destruction as mandated by Osama Bin Laden. That fact alone does not warrant America’s continuous presence there to this day.

The example of Iraq is even more revealing of the imperialist desire of the U.S. The latter invaded the former reputedly because Iraq possessed weapons of mass destruction—which were never found, if they were even looked for at all. Instead, the previous leader of the country’s government, Saddam Hussein, was unseated from his position of power and killed, while the U.S. implemented its own democratic form of government. This aspect of imperialism directly relates to Social Darwinism. What this theory presupposes to be a difference in quality—with the practices and methods of Westernized culture held to be inherently superior—is actually just a difference in type. However, it is Westerners who ascribe the superiority of their culture, a tendency that Darwin himself exhibits in the following quotation in which he describes indigenous foreigners. “These men were absolutely naked and bedaubed with paint…they possessed hardly any arts, and, like wild animals, lived on what they could catch; they had no government, and were merciless to everyone not of their small tribe” (Burton and Dworkin).

This quotation expresses cultural and sociological differences. However, Darwin, in his Westernized ideal perspective, ascribes a value judgment based on these differences, in which these foreigners are less developed (in the evolutionary scale), less intelligent and less capable of self-determination than Europeans. This same line of thinking continues among Social Darwinians today, and certainly applies to the spread of imperialism through the War on Terror (in Iraq in particular) and justifies the use of force for material gains. Social Darwinism has been thoroughly ingrained in Westernized culture, and readily facilitates any imperialist actions on the part of Westerners.


8 Ways To Be More Proactive In Life (+ Examples)


Proactive (adj): taking action by causing change and not only reacting to change when it happens.

We’re often told that being proactive is the best approach to life.

That we should take the bull by the horns and in that way rise above mediocrity to a new level of success, both in our career and in our personal life.

If you’ve ever read Stephen Covey’s influential bestseller 7 Habits of Highly Effective People, you’ll know that the first ‘habit’ is Be Proactive, Not Reactive.

It’s interesting that the concept of proactivity has given rise to another buzzword of our age: empowerment.

This makes complete sense because it’s impossible to feel empowered if you’re merely reacting to events.

You need to be firmly in the driving seat to be sure that you have the power to influence your life.

There are three types of people in this world. Firstly, there are people who make things happen. Then there are people who watch things happen. Lastly, there are people who ask, “What happened?” Which do you want to be?

Clearly, it’s the first kind of person who displays proactive behavior.

And the very fact that you’ve clicked on this article indicates that you’re keen to learn more about this potentially life-changing quality.

You’d like to be someone who makes things happen.

And why not? Being proactive is undoubtedly an attractive quality to have.

Let’s face it, if you think about the people you admire the most, chances are it’s not those who react to change when it happens, or those who just roll with the punches while wondering what happened…

…it’s those who take control and actively get stuff done who stand out.

How about becoming one of those people?

With a bit of helpful guidance, it’s not that difficult to switch your mindset.

Rather than being a passive person who takes whatever life throws at you, you can become an active participant in the ups and downs, potentially with the power to control and direct them in a way that’s favorable to you.

But what if that’s not your pre-programmed personality type?

What if you think you lack either the ideas or the initiative to be that proactive person who is so sought after, especially in the world of business?

And what if you feel that your life outside of work could do with a little more proactivity?

What if your default setting is passive acceptance of the status quo, merely reacting to stimuli when needs must?

Well, if you’re now ready to break out of the cycle of being a passive receiver, there is good news: proactivity isn’t some mysterious gift that we either possess or not.

Everyone has the potential to be the kind of person who makes things happen.

It’s actually a habitual mindset we can develop and strengthen over time.

But first, let’s clarify something…


Introduction

The world in which we currently live confronts people responsible for making decisions about security with very challenging issues. These issues call for sophisticated logical and statistical analysis, detection and forecasting systems, cost-benefit analysis, and the like. However, the crux of security is the necessity of dealing with the prospect of potential danger. Because potential dangers have had very substantial consequences for reproductive fitness for many thousands of years, evolution has shaped brain systems specially adapted for managing them. Thus, in addition to the logical armamentarium that present-day decision-makers bring to issues of security, they inevitably bring the intuitions and motivations that are generated by a biologically ancient, “hard-wired” system.

This potential-threat system in the brain has been termed the defense system (Trower et al., 1990) and the hazard-precaution system (Boyer and Lienard, 2006). In our own work, we have called it the security motivation system (Szechtman and Woody, 2004). Our research investigating this system has focused on its role in everyday circumstances, such as behavior to manage threats of contagion due to dirt and germs, and in pathological variants of these behaviors, such as the compulsive hand-washing seen in obsessive-compulsive disorder (OCD). However, it is likely that the influence of the security motivation system extends well beyond such relatively mundane circumstances. The purpose of this perspective article is to explain briefly what we know about the security motivation system and to advance the following question: Does this biological system affect policy-making about security in important ways? We hope to stimulate the thinking of researchers who investigate security-related decision-making, in particular by sketching some of the kinds of hypotheses that could be examined in such research.


Chapter 2

The opioid epidemic took hold in the U.S. in the 1990s. Percocet, OxyContin and Opana became commonplace wherever chronic pain met a chronic lack of access to quality health care, especially in Appalachia.

The Centers for Disease Control and Prevention calls the prescription opioid epidemic the worst of its kind in U.S. history. “The bottom line is this is one of the very few health problems in this country that’s getting worse,” said Dr. Tom Frieden, director of the CDC.

U.S. Heroin Use By Year

“We had a fourfold increase in deaths from opiates in a decade,” Frieden said. “That’s nearly 17,000 people dying from prescription opiate overdoses every year. And more than 400,000 go to an emergency room for that reason.”

Clinics that dispensed painkillers proliferated with only the loosest of safeguards, until a recent coordinated federal-state crackdown crushed many of the so-called “pill mills.” As the opioid pain meds became scarce, a cheaper opioid began to take over the market — heroin. Frieden said three quarters of heroin users started with pills.

Federal and Kentucky officials told The Huffington Post that they knew the move against prescription drugs would have consequences. &ldquoWe always were concerned about heroin,” said Kevin Sabet, a former senior drug policy official in the Obama administration. “We were always cognizant of the push-down, pop-up problem. But we weren’t about to let these pill mills flourish in the name of worrying about something that hadn’t happened yet. … When crooks are putting on white coats and handing out pills like candy, how could we expect a responsible administration not to act?&rdquo

As heroin use rose, so did overdose deaths. The statistics are overwhelming. In a study released this past fall examining 28 states, the CDC found that heroin deaths doubled between 2010 and 2012. The CDC reported recently that heroin-related overdose deaths jumped 39 percent nationwide between 2012 and 2013, surging to 8,257. In the past decade, Arizona’s heroin deaths rose by more than 90 percent. New York City had 420 heroin overdose deaths in 2013 — the most in a decade. A year ago, Vermont’s governor devoted his entire State of the State speech to heroin’s resurgence. The public began paying attention the following month, when Philip Seymour Hoffman died from an overdose of heroin and other drugs. His death followed that of actor Cory Monteith, who died of an overdose in July 2013 shortly after a 30-day stay at an abstinence-based treatment center.

In Cincinnati, an entry point for heroin heading to Kentucky, the street dealers beckoning from corners call it “dog” or “pup” or “dog food.” Sometimes they advertise their product by barking at you. Ohio recorded 680 heroin overdose deaths in 2012, up 60 percent over the previous year, with one public health advocate telling a local newspaper that Cincinnati and its suburbs suffered a fatal overdose every other day. Just over the Ohio River, the picture is just as bleak. Between 2011 and 2012, heroin deaths increased by 550 percent in Kentucky and have continued to climb steadily. This past December alone, five emergency rooms in Northern Kentucky saved 123 heroin-overdose patients; those ERs saw at least 745 such cases in 2014, 200 more than the previous year.

For addicts, cravings override all normal rules of behavior. In interviews throughout Northern Kentucky, addicts and their families described the insanity that takes hold. Some addicts shared stories of shooting up behind the wheel while driving down Interstate 75 out of Cincinnati, or pulling over at an early exit, a Kroger parking lot. A mother lamented her stolen heirloom jewelry and the dismantling of the family cabin piece by piece until every inch had been sold off. Addicts stripped so many houses, barns, and churches of copper and fixtures in one Kentucky county that the sheriff formed a task force. Another addict overdosed on the couch, and his parents thought maybe they should just let him go.

Northern Kentucky Hit Hard By Heroin Overdoses

Between 2011 and 2014, heroin overdoses at five Kentucky emergency rooms outside of Cincinnati — Covington, Ft. Thomas, Edgewood, Florence and Grant County — increased by 669 percent.

Chemistry, not moral failing, accounts for the brain’s unwinding. In the laboratories that study drug addiction, researchers have found that the brain becomes conditioned by the repeated dopamine rush caused by heroin. “The brain is not designed to handle it,” said Dr. Ruben Baler, a scientist with the National Institute on Drug Abuse. “It’s an engineering problem.”

Dr. Mary Jeanne Kreek has been studying the brains of people with addiction for 50 years. In the 1960s, she was one of three scientists who determined that methadone could be a successful maintenance treatment for an opioid-addicted person. Over the years, various drug czars from both political parties have consulted her at Rockefeller University in New York City, where she is a professor and head of the Laboratory of the Biology of Addictive Diseases. According to Kreek, there’s no controversy over how opiate addiction acts upon the brain.

“It alters multiple regions in the brain,” Kreek said, “including those that regulate reward, memory and learning, stress responsivity, and hormonal response, as well as executive function which is involved in decision-making — simply put, when to say yes and when to say no.”

A heroin addict entering a rehab facility presents as severe a case as a would-be suicide entering a psych ward. The addiction involves genetic predisposition, corrupted brain chemistry, entrenched environmental factors and any number of potential mental-health disorders — it requires urgent medical intervention. According to the medical establishment, medication coupled with counseling is the most effective form of treatment for opioid addiction. Standard treatment in the United States, however, emphasizes willpower over chemistry.

To enter the drug treatment system, such as it is, requires a leap of faith. The system operates largely unmoved by the findings of medical science. Peer-reviewed data and evidence-based practices do not govern how rehabilitation facilities work. There are very few reassuring medical degrees adorning their walls. Opiates, cocaine and alcohol each affect the brain in different ways, yet drug treatment facilities generally do not distinguish between the addictions. In their one-size-fits-all approach, heroin addicts are treated like any other addicts. And with roughly 90 percent of facilities grounded in the principle of abstinence, that means heroin addicts are systematically denied access to Suboxone and other synthetic opioids.

On average, private residential treatment costs roughly $31,500 for 30 days. Addicts experience a hodgepodge of drill-instructor tough love, and self-help lectures, and dull nights in front of a television. Rules intended to instill discipline govern all aspects of their lives, down to when they can see their loved ones and how their bed must be made every morning. A program can seem both excessively rigid and wildly disorganized.

After a few weeks in a program, opiate addicts may glow as if born again and testify to a newfound clarity. But those feelings of power and self-esteem can be tethered to the rehabilitation facility. Confidence often dims soon after graduation, when they once again face real life with a still-warped brain hypersensitive to triggers that will push them to use again. Cues such as a certain smell associated with the drug or hearing the war stories of other addicts could prompt a relapse.

“The brain changes, and it doesn’t recover when you just stop the drug because the brain has been actually changed,” Kreek explained. “The brain may get OK with time in some persons. But it’s hard to find a person who has completely normal brain function after a long cycle of opiate addiction, not without specific medication treatment.”

An abstinence-only treatment that may have a higher success rate for alcoholics simply fails opiate addicts. “It’s time for everyone to wake up and accept that abstinence-based treatment only works in under 10 percent of opiate addicts,” Kreek said. “All proper prospective studies have shown that more than 90 percent of opiate addicts in abstinence-based treatment return to opiate abuse within one year.” In her ideal world, doctors would consult with patients and monitor progress to determine whether Suboxone, methadone or some other medical approach stood the best chance of success.

A 2012 study conducted by the National Center on Addiction and Substance Abuse at Columbia University concluded that the U.S. treatment system is in need of a “significant overhaul” and questioned whether the country’s “low levels of care that addiction patients usually do receive constitutes a form of medical malpractice.”

While medical schools in the U.S. mostly ignore addictive diseases, the majority of front-line treatment workers, the study found, are low-skilled and poorly trained, incapable of providing the bare minimum of medical care. These same workers also tend to be opposed to overhauling the system. As the study pointed out, they remain loyal to “intervention techniques that employ confrontation and coercion — techniques that contradict evidence-based practice.” Those with “a strong 12-step orientation” tended to hold research-supported approaches in low regard.

Researchers have been making breakthroughs in addiction medicine for decades. But attempts to integrate science into treatment policy have been repeatedly stymied by scaremongering politics. In the early 1970s, the Nixon administration promoted methadone maintenance to head off what was seen as a brewing public health crisis. Due to fears of methadone’s misuse, however, regulations limited its distribution to specialized clinics, and it became a niche treatment. Methadone clinics have since become the targets of NIMBYs and politicians who view them as nothing more than nuisance properties. In the late 1990s, then-New York City Mayor Rudy Giuliani tried unsuccessfully to cut methadone programs serving 2,000 addicts on the grounds that despite the medication’s success as a treatment, it was an immoral solution and had failed to get the addicts employed.

A new medication developed in the 1970s, buprenorphine, was viewed as a safer alternative to methadone because it had a lower overdose risk. “Bupe,” as it’s become known, was originally approved for pain relief, but knowledgeable addicts began using it as a black market route to drug rehabilitation. Government approval had to catch up to what these addicts had already field tested. After buprenorphine became an accepted treatment in France in the mid-1990s, other countries began to treat heroin addicts with the medication. Where buprenorphine has been adopted as part of public policy, it has dramatically lowered overdose death rates and improved heroin addicts’ chances of staying clean.

In 2002, the U.S. Food and Drug Administration approved both buprenorphine (Subutex) and buprenorphine-naloxone (Suboxone) for the treatment of opiate dependence. Suboxone combines bupe with naloxone, the drug that paramedics use to revive overdose victims. These medications are what’s called partial agonists, which means they have a ceiling on how much effect they can deliver, so extra doses will not make the addict feel any different.

Whereas generic buprenorphine can produce a high if injected, Suboxone was formulated to be more difficult to manipulate. If an addict uses it improperly by injecting it, the naloxone kicks in and can send the person into withdrawal — the opposite of a good time.

In the U.S., the more abuse-resistant Suboxone dominates the market, making it the most widely prescribed of the medically assisted treatments for opioid addiction.

Neither Suboxone nor methadone is a miracle cure. They buy addicts time to fix their lives, seek out counseling and allow their brains to heal. Doctors recommend tapering off the medication only with the greatest of caution. The process can take years given that addiction is a chronic disease and effective therapy can be a long, grueling affair. Doctors and researchers often compare addiction from a medical perspective to diabetes. The medication that addicts are prescribed is comparable to the insulin a diabetic needs to live.

“If somebody has a heroin dependence and they did not have the possibility to be offered methadone or Suboxone, then I think it’s a fairly tall order to try and get any success,” said Dr. Bankole Johnson, professor and chair of the Department of Psychiatry at the University of Maryland School of Medicine. “There have been so many papers on this — the impact of methadone and Suboxone. It’s not even controversial. It’s just a fact that this is the best way to wean people off an opioid addiction. It’s the standard of care.”


But as the National Center on Addiction and Substance Abuse study pointed out, treatment as a whole hasn’t changed significantly. Dr. A. Thomas McLellan, the co-founder of the Treatment Research Institute, echoed that point. “Here’s the problem,” he said. Treatment methods were determined “before anybody really understood the science of addiction. We started off with the wrong model.”

For families, the result can be frustrating and an expensive failure. McLellan, who served as deputy director of the White House’s Office of National Drug Control Policy from 2009 to 2011, recalled recently talking to a despairing parent with an opiate-addicted son. The son had been through five residential treatment stays, costing the family more than $150,000. When McLellan mentioned buprenorphine, the father said he had never heard of it.

Most treatment programs haven’t accepted medically assisted treatments such as Suboxone because of “myths and misinformation,” said Robert Lubran, the director of the pharmacological therapy division at the federal Substance Abuse and Mental Health Services Administration.

In fiscal year 2014, SAMHSA, which helps to fund drug treatment throughout the country, had a budget of roughly $3.4 billion dedicated to a broad range of behavioral health treatment services, programs and grants. Lubran said he didn’t believe any of that money went to programs specifically aimed at treating opioid-use disorders with Suboxone and methadone. It’s up to the states to use block grants as they see fit, he said.

Kentucky has approached Suboxone in such a shuffling and half-hearted way that just 62 or so opiate addicts treated in 2013 in all of the state’s taxpayer-funded facilities were able to obtain the medication that doctors say is the surest way to save their lives. Last year that number fell to 38, as overdose deaths continued to soar.

Waiver Required

Federal waivers are required for doctors to prescribe buprenorphine products like Suboxone. Kentucky has 518 doctors with such waivers, most clustered around cities like Louisville and Lexington.

In multiple states struggling to manage the epidemic, thousands of addicts have no access to Suboxone. There have been reports by doctors and clinics of waiting lists for the medication in Kentucky, Ohio, central New York and Vermont, among others. In one Ohio county, a clinic’s waiting list ran to more than 500 patients. Few doctors choose to get certified to dispense the medication, and those who do work under rigid federal caps on how many patients they can treat. Some opt not to treat addicts at all. According to state data, more than 470 doctors are certified in Kentucky, but just 18 percent of them fill out 80 percent of all Suboxone prescriptions.

There’s no single explanation for why addiction treatment is mired in a kind of scientific dark age, why addicts are denied the help that modern medicine can offer. Family doctors tend to see addicts as a nuisance or a liability and don’t want them crowding their waiting rooms. In American culture, self-help runs deep. Heroin addiction isn’t only a disease – it’s a crime. Addicts are lucky to get what they get.


Environmental impact:

Because the business involves no production process, its environmental impact is minimal. The operation is application-based and therefore does not directly affect the environment. Moreover, by reducing scattered parking and the traffic congestion it causes, the business indirectly helps to minimize environmental hazards.

Community impact:

Local residents and shopkeepers will benefit from online parking allotment. Shopping and residential premises will be kept free of haphazard parking, which will make community development and improvement easier.

The anticipated risk factors associated with the business are:

  • Unavailability of start-up funds
  • Loss of assets
  • Unavailability of ample number of parking slots
  • High rental costs to tie up with the parking owners

Strategies such as proper handling of assets and effective promotion of awareness about online parking will be the primary means of avoiding these uncertainties and their consequences.


8 Ways To Be More Proactive In Life (+ Examples)


Proactive (adj): taking action by causing change and not only reacting to change when it happens.

We’re often told that being proactive is the best approach to life.

That we should take the bull by the horns and in that way rise above mediocrity to a new level of success, both in our career and in our personal life.

If you’ve ever read Stephen Covey’s influential bestseller 7 Habits of Highly Effective People, you’ll know that the first ‘habit’ is Be Proactive, Not Reactive.

It’s interesting that the concept of proactivity has given rise to another buzzword of our age: empowerment.

This makes complete sense because it’s impossible to feel empowered if you’re merely reacting to events.

You need to be firmly in the driving seat to be sure that you have the power to influence your life.

There are three types of people in this world. Firstly, there are people who make things happen. Then there are people who watch things happen. Lastly, there are people who ask, “What happened?” Which do you want to be?

Clearly, it’s the first kind of person who displays proactive behavior.

And the very fact that you’ve clicked on this article indicates that you’re keen to learn more about this potentially life-changing quality.

You’d like to be someone who makes things happen.

And why not? Being proactive is undoubtedly an attractive quality to have.

Let’s face it, if you think about the people you admire the most, chances are it’s not those who react to change when it happens, or those who just roll with the punches while wondering what happened…

…it’s those who take control and actively get stuff done who stand out.

How about becoming one of those people?

With a bit of helpful guidance, it’s not that difficult to switch your mindset.

Rather than being a passive person who takes whatever life throws at you, you can become an active participant in the ups and downs, potentially with the power to control and direct them in a way that’s favorable to you.

But what if that’s not your pre-programmed personality type?

What if you think you lack either the ideas or the initiative to be that proactive person who is so sought after, especially in the world of business?

And what if you feel that your life outside of work could do with a little more proactivity?

What if your default setting is passive acceptance of the status quo, merely reacting to stimuli when needs must?

Well, if you’re now ready to break out of the cycle of being a passive receiver, there is good news: proactivity isn’t some mysterious gift that we either possess or not.

Everyone has the potential to be the kind of person who makes things happen.

It’s actually a habitual mindset we can develop and strengthen over time.

But first, let’s clarify something…


Social Darwinism in Western History and Politics

It is crucial to note the implicit and overt role that Social Darwinism continues to play in the imperialist appetites of Westernized powers, led by the U.S. in a global mission of conquest cleverly euphemized as conducting a “War on Terrorism”. U.S. involvement in places such as the Middle East has traditionally been based on the region’s lucrative natural resources. In the 1970s, America turned toward the Middle East because of gas shortages and correspondingly high prices; in the final decade of the century it embarked on Operation Desert Storm in order to lower gas prices and gain better access to oil.

Social Darwinism can justify both of these occurrences, and has been so implicitly and subtly ingrained in Westernized culture that it is now simply an afterthought to any aggressive military-based operation. The rationale for U.S. involvement in these aforementioned incidents is quite simple—the country views these natural resources as vital to its survival, and therefore believes that it has the military and economic strength to force those in control of such resources in foreign lands to comply with American wishes. When they are slow to do so or less than cooperative in the terms of their compliance, military force ensues, which may result in a lasting occupation, as more recent events in Iraq and Afghanistan seemingly indicate. The relevance of Social Darwinism to these forms of imperialism is easily discerned in the subsequent quotation.

Imperialism

The application of Darwin’s biological concepts to the social world…buttressed imperialism, racism, nationalism, and militarism—doctrines that preached relentless conflict. Social Darwinists insisted that nations and races were engaged in a struggle for survival in which only the fittest survive and deserve to survive (Perry).

Thus, America certainly feels justified in demanding resources and taking whatever military action is necessary to set up economic and traditional methods of imperialism as simply a means of survival. Its current “War on Terror” provides an excellent example of this fact. The U.S. has occupied both Afghanistan and Iraq and imposed its own form of government there, in events that can be traced to the destruction of the World Trade Center. Doing so is simply contemporary imperialism in the traditional sense in which countries appropriated foreign territory as their own. Afghanistan was reportedly linked to the World Trade Center attacks ordered by Osama bin Laden, yet that fact alone does not warrant America’s continuous presence there to this day.

The example of Iraq is even more revealing of the imperialist desire of the U.S. The latter invaded the former reputedly because Iraq possessed weapons of mass destruction—which were never found, if they were even looked for at all. Instead, the previous leader of the country’s government, Saddam Hussein, was unseated from his position of power and killed, while the U.S. implemented its own democratic form of government. This aspect of imperialism directly relates to Social Darwinism. What this theory presupposes to be a difference in quality—with the practices and methods of Westernized culture held to be inherently superior—is actually just a difference in type. However, it is Westerners who ascribe the superiority of their culture, a tendency that Darwin himself exhibits in the following quotation in which he describes indigenous foreigners: “These men were absolutely naked and bedaubed with paint…they possessed hardly any arts, and, like wild animals, lived on what they could catch; they had no government, and were merciless to everyone not of their small tribe” (Burton and Dworkin).

This quotation expresses cultural and sociological differences. However, Darwin, in his Westernized ideal perspective, ascribes a value judgment based on these differences, in which these foreigners are less developed (on the evolutionary scale), less intelligent and less capable of self-determination than Europeans. This same line of thinking continues among Social Darwinists today, and certainly applies to the spread of imperialism through the War on Terror (in Iraq in particular) and justifies the use of force for material gains. Social Darwinism has been thoroughly ingrained in Westernized culture, and readily facilitates any imperialist actions on the part of Westerners.




‘Fear of not delivering’

Climbing to the top of the peloton’s hierarchy is difficult. The fear of falling off that pedestal can be crushing.

Tyler Farrar lived that emotional rollercoaster during his 13-year pro career. After turning professional at 19, he enjoyed the best-ever run by an American sprinter from 2009-2011, winning stages in all three grand tours, and other major races. Then the wins stopped coming. Farrar spent years wondering why he was unable to challenge André Greipel and Mark Cavendish, as he had done before.

“It was hard, but as a pro, you have to be able to make an honest assessment,” Farrar says. “The problem was that I wasn’t getting any slower. My power was the same. Times had changed. The races had changed. Today, there are three climbs in the last 50km. I like flat roads.”

Desperate to chase results, Farrar began taking more risks and crashing more often. It was a high-wire act, and Farrar experienced first-hand the emotional, physical, and mental strain that every pro endures. After a few subpar seasons, he embraced a chance to become a team captain and domestique at the expanding Dimension Data team. Rather than race to win, he raced to help his teammates win.

Farrar was lucky that he could find a new role and extend his career. Riders are constantly squeezed and pressed by the media and sponsors to win, no matter the cost. Often, there’s no support structure for those riders when they lose.

“You’re around people a lot, but it can also be very isolating,” Farrar says. “It’s every man for himself in a lot of ways.”

There is a thin line between injury — or worse — and glory. Straddling that line requires a special mental toolbox. Measuring risk, overcoming apprehension, managing the anxiety that comes with it, and eventually embracing the fear is as essential to becoming a professional cyclist as one’s VO2max.

Facing such pressures, it’s no surprise that professional cyclists often adopt neurotic behavior. Many are prone to extreme emotions. Since cycling is such a mentally taxing profession, it’s surprising how little attention has been paid to that aspect of performance. Many agree more could be done.

“Some riders fear success,” Keim says. “They’re the underdog, and it becomes their identity. Then they do well, and they have to keep it up. They become their own worst enemy. Others have the unique ability to really step up when the pressure is on to perform. Every athlete has their own unique qualities and challenges.”

Professional racing can be a nomadic experience, with pros living out of suitcases and in hotels for months on end. They’re often isolated, living in a cocooned existence that is unlike any other athletic endeavor. Race, eat, sleep, recover, and then repeat. That’s their day-to-day mantra. Inside that seemingly benign routine, cyclists face real or imagined demons. There’s often a lack of introspection. Results count; whining doesn’t.

During the off-season, Farrar would retreat to the mountains of Washington to get away from the chaos of racing. Hunting and exploring were his salve. After retiring from his racing career, despite having one year left on his contract, Farrar began preparing to be a firefighter. He wanted to give “something back” to his community after dedicating most of his life to the self-serving pursuit of racing.

“I didn’t want to be the guy who stuck around too long, and just hung on for dear life in his career,” Farrar says. “Cycling was always my dream, but the one thing I struggled with in being a pro athlete is that it’s a pretty selfish existence.”

Farrar is something of an oddity in the peloton: a pro who retired on his terms with sound body and mind.

Tyler Farrar. Photo: Tim De Waele | Getty Images


11 Answers

I think the golden rule here is "Everything in moderation".

If you are fairly certain a piece of code is going to prove to be a bottleneck, it's not a horrible practice to do some initial optimization. At the very least, it's a good idea to take steps to make sure it will be easy to refactor later.

What you want to avoid is going overboard by sacrificing time and readability in the name of micro-optimizations before you've seen actual data to justify such an effort.

The line between "optimizing" and just "sensible design" is sometimes fairly fine, but other times pretty obvious. Just for example, you don't need a profiler to be pretty sure that if you're sorting a few million items, it's worth using an O(N log N) algorithm rather than an O(N²) algorithm. IMO, that just falls under being reasonably sensible, not optimization, though.
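As a rough illustration of that distinction (a C++ sketch of my own, not code from the answer): reaching for the standard library's O(N log N) sort rather than a hand-rolled quadratic loop is just sensible design, not premature optimization.

```cpp
#include <algorithm>
#include <vector>

// O(N^2): tolerable for a handful of items, hopeless for a few million.
void bubble_sort(std::vector<int>& v) {
    for (std::size_t i = 0; i < v.size(); ++i)
        for (std::size_t j = 1; j < v.size() - i; ++j)
            if (v[j - 1] > v[j]) std::swap(v[j - 1], v[j]);
}

// O(N log N): the obvious default for large inputs; no profiler needed.
void sort_items(std::vector<int>& v) {
    std::sort(v.begin(), v.end());
}
```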

There are also some things you might as well do, simply because they might provide a benefit, and the cost is minimal to nonexistent. To use @unholysampler's example, writing ++i instead of i++ may have some minuscule cost if you're accustomed to typing it as a post-increment, but (at most) it's temporary and trivial. I wouldn't spend any time rewriting working code for the sake of possibly saving a nanosecond, unless the profiler had shown that I really needed that time and stood a reasonable chance of saving it there. At the same time, when I'm just typing in new code, I'd work at habitually using the form that's likely to be better, because once you do so habitually it's free. It frequently won't gain anything, and even when it makes a difference it often won't be large enough to notice or care about -- but it's still free, so there's no reason not to do it.
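For instance, a minimal sketch of that habit (illustrative C++, not from the answer): on a non-trivial iterator, pre-increment avoids constructing a temporary copy that post-increment would just throw away, and it costs nothing to type once it is habitual.

```cpp
#include <list>
#include <string>

// Habitual pre-increment: for plain ints the generated code is the same
// either way; for non-trivial iterators ++it at least avoids creating an
// unused temporary copy of the iterator.
std::size_t count_nonempty(const std::list<std::string>& items) {
    std::size_t n = 0;
    for (auto it = items.begin(); it != items.end(); ++it) {
        if (!it->empty()) ++n;
    }
    return n;
}
```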

Cases like those are fairly obvious, and in my mind would really fall under sensible design rather than what I'd think of as truly optimization. Most other cases of "optimization without representation" would be considerably harder to justify though. If you're going to spend any significant time or effort on the optimization, you should have something much more solid than a "gut feel" to justify it.

I should add that part of the reason I say that is that I think profiling code is extremely useful, even when your goal isn't optimization. A profiler gives a high-level overview of a piece of code that can be extremely useful even when you don't particularly care about optimization at all. Just for example, if you see 10x as many calls to allocate a resource as to free that type of resource, it's a pretty good clue that there's a real problem, even if the code currently seems to run just fine. When you get down to it, a lot of code has a lot of things that should match up (not necessarily 1:1, but somehow or other) and a profiler can show mismatches like that much more quickly than most other tools.

The point of profiling before optimizing is that you need a baseline to determine how much improvement the optimization gave you. The "surprising" information I get from my profiler will be along these lines:

  • I knew that method was called a lot, but I didn't realize it was called that much.
  • Thread monitor contention almost never slows things down where I expect it.
  • I'm generating that many instances of XYZ? It should only be about n.

That said, many times the profiler merely confirms my suspicions. Good scientific method involves healthy doses of monitoring and experimentation. It's a bit difficult, but I do try to figure out more systemic problems with the profiler. I could do the obvious change and get a 5% improvement, or if I approached the problem differently I might be able to get a 25% improvement. (Most optimizations don't yield such a large improvement, but on occasion they do.) Of course, I wouldn't know how much my optimization improved the performance without a baseline measurement.
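To make the baseline idea concrete, here is a minimal timing sketch (assuming C++ and std::chrono; the workload function is only a stand-in for whatever code is actually under study). Run the same measurement before and after the change; the delta is your improvement.

```cpp
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <vector>

// Stand-in workload; replace with the code you actually intend to optimize.
static void do_work() {
    std::vector<int> v(1000000);
    for (std::size_t i = 0; i < v.size(); ++i)
        v[i] = static_cast<int>(v.size() - i);
    std::sort(v.begin(), v.end());
}

// Measure the same workload before and after a change; without the
// "before" number there is no way to say what the optimization bought.
static double measure_seconds() {
    const auto start = std::chrono::steady_clock::now();
    do_work();
    const auto stop = std::chrono::steady_clock::now();
    return std::chrono::duration<double>(stop - start).count();
}

int main() {
    std::printf("elapsed: %.3f s\n", measure_seconds());
    return 0;
}
```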

"I can sense are going to be bottlenecks" (sic)

The problem with this statement is observer error. Just because you think it may be bad doesn't mean it IS bad. The profiler will give empirical evidence and keep you from spending time in an area that may give no improvement. That's why you should start with the profiler before optimizing. Let the program and profiler PROVE that it is a slow section first.

There are known performance killers and known best practices which avoid them. I don't think you need to profile to determine a cursor is slower in SQL Server than a set-based operation. If you start out knowing this, you will write code that performs better from the start without the need to profile the two options every time you write code.

If you are adjusting existing code, it is better to profile, not only so you can confirm that the code you believe is inefficient really is performing badly, but also because profiling can show up problems you didn't know you had. I remember profiling one thing where I suspected the stored proc could be optimized (it could) and found out that it was being called hundreds of times by the application when it only needed to be called once.

Additionally, you have the benefit of being able to prove that you did in fact improve performance when you profile. I personally save these before and after figures and use them in my performance appraisal write-ups and in discussing my achievements when job hunting. It's nice to have actual figures to show how good you are at tuning.

The reason people say that is because you won't know what needs to be optimized before your profiling is finished, so you could end up wasting a lot of time for no benefits.

Still, if something leaps out at you as a bad idea (e.g., your coworker chooses to use deep recursion instead of a simple iterative loop), you can fix it. But you should be focused on moving forward, not navel-gazing at old code. There is a stage in the process where that is appropriate.

I think everyone tries to write good code from the beginning. That sounds like what you're doing.

What I think should be done in the beginning is to just keep the design simple, especially the data structure. Often people start off assuming they need a more sophisticated data structure, redundant data, and detailed notification techniques because they are worried about performance. In my experience, those things cause the very problems they are supposed to avoid.

In spite of good coding practice and good design, performance problems creep in, and you need to remove them periodically. These are almost never things you could have guessed, and most profilers are not very good at finding them either. What's more, the optimization level of the compiler seldom has any effect on them, because mostly they are not tight compute-bound loops. Mostly they present as innocent-looking (or even invisible) function calls that, if you randomly snapshot the stack, are in the middle of it, and are consuming way more wall-clock time than you ever would have imagined, as shown by how often they appear there.

There are two kinds of optimisation that can (relatively) safely be done before profiling.

Algorithmic optimisation: choosing an algorithm with better average (or worst-case) complexity. This can even (should?) be done before beginning coding. You'll still have to check that the selected algorithm is the correct one given your real data set, but it is a good idea to start with an algorithm that is expected to fare better, isn't it?

Data structure optimisation: laying out your data correctly, or using a data structure with better locality, can increase your performance, but it will have an impact on the algorithms that can be used, so it is easier to do such an optimisation before coding (and you cannot use a profiler when there is no code yet). For example, when programming a video game, it is generally better to use struct-of-arrays (SoA) instead of array-of-structs (AoS) to store data, as it benefits from data locality, cache coherency, and so on.
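A small sketch of the two layouts (illustrative C++ with hypothetical field names): a per-frame pass that only touches positions and velocities streams through contiguous arrays in the SoA version, while the AoS version drags every particle's unused fields through the cache.

```cpp
#include <vector>

// Array-of-structs: each particle's fields sit together, so a pass that
// only needs positions and velocities still pulls the colors into cache.
struct ParticleAoS {
    float x, y, z;
    float vx, vy, vz;
    unsigned char r, g, b, a;
};

// Struct-of-arrays: fields of the same kind are contiguous, so the same
// pass streams through memory with much better locality.
struct ParticlesSoA {
    std::vector<float> x, y, z;
    std::vector<float> vx, vy, vz;
    std::vector<unsigned char> r, g, b, a;
};

// Advance positions by one time step using the SoA layout.
void advance(ParticlesSoA& p, float dt) {
    for (std::size_t i = 0; i < p.x.size(); ++i) {
        p.x[i] += p.vx[i] * dt;
        p.y[i] += p.vy[i] * dt;
        p.z[i] += p.vz[i] * dt;
    }
}
```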

Yes, it is wrong to optimize before profiling, BUT

Good programming, programming that makes code simpler and more straightforward, doesn't require profiling. Good programming, like moving unneeded initializations out of loops, does not need any more justification than the fact that you are improving the quality of the code.
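By way of illustration, here is a sketch (hypothetical C++ names) of the kind of change meant here: hoisting a loop-invariant computation out of the loop makes the code both simpler and cheaper, and needs no profiler to justify.

```cpp
#include <cmath>
#include <vector>

// Before: the scale factor is recomputed on every iteration even though
// it never changes inside the loop.
void scale_naive(std::vector<double>& values, double db) {
    for (std::size_t i = 0; i < values.size(); ++i) {
        double factor = std::pow(10.0, db / 20.0);  // loop-invariant work
        values[i] *= factor;
    }
}

// After: the invariant is hoisted out; the loop body is simpler and the
// redundant work is gone.
void scale_hoisted(std::vector<double>& values, double db) {
    const double factor = std::pow(10.0, db / 20.0);
    for (std::size_t i = 0; i < values.size(); ++i) {
        values[i] *= factor;
    }
}
```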

IMHO the only times you need to profile is when you are specifically looking to improve performance. In order to show the improvement you must baseline first, and then show the delta.

Any time you add complexity under the guise of optimization without proof that there was a bottleneck and that the change actually improves performance, you are just doing bad programming.

The critical distinction is between:

The optimized code is as simple or simpler than the un-optimized.

The optimized code is more complex (and therefore more error prone and harder to modify in the future) than the un-optimized code.

In the first case, sure, go ahead. In the second case, you have to weigh the investment in development time (including the opportunity cost of not using the same time to fix bugs or deliver features) and the future higher cost of maintenance for a more complex solution. You have to weigh this cost against the observable improvements to performance. How will you perform this judgement if you have no idea what the performance cost is? A function might be obviously inefficient, but if it only takes a few milliseconds anyway, a 1000x performance optimization will not provide any value. Not to mention the opportunity cost of not working on an optimization where it actually matters.

Second, your intuition about performance might very well be wrong - and you will never know if you "optimize" before measuring. For example, many developers tend to think that, say, an O(log n) algorithm is faster than an O(n) one. But you don't know that. The O(n) algorithm might be faster as long as n is below some threshold. What is that threshold? You probably don't know. And what is n actually in your particular program? Is it usually above or below this threshold? How will you find out?
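To make the threshold point concrete, here is a sketch (illustrative C++) contrasting a linear scan over a contiguous vector with a logarithmic lookup in a node-based set. Below some data- and machine-dependent crossover size, the "slower" O(n) scan routinely wins, and only a measurement will tell you where that crossover sits.

```cpp
#include <algorithm>
#include <set>
#include <vector>

// O(n): one cache-friendly pass over contiguous memory.
bool contains_linear(const std::vector<int>& v, int key) {
    return std::find(v.begin(), v.end(), key) != v.end();
}

// O(log n): fewer comparisons, but each step chases a pointer through a
// node-based tree. For small n the linear scan often wins in practice;
// where the crossover lies depends on the data and the machine.
bool contains_logarithmic(const std::set<int>& s, int key) {
    return s.find(key) != s.end();
}
```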

What if your intuition is wrong and your optimization actually makes the program run slower? If you profiled before and after, you realize your mistake and roll the changes back. At worst, you wasted some time. If you don't profile, you are none the wiser and have caused long-term damage to the program.

The really difficult decision is when you can go down different roads in the architecture. Should the network communication be JSON over HTTP (simple) or protocol buffers over TCP (performant)? The problem here is that you have to decide up front, before you can even measure performance, if you don't want to waste work by having to change the protocol later. In this case you cannot just start out with the simple version and then optimize later when it turns out to be a problem. But you shouldn't just choose the most performant version by default either. You will have to make some educated guesses and projections.

I note you state that profiling "as often as not" gives the same result as you intuition based on your understanding of the program. I take that to mean that you have a 50% success rate in predicting the best way to allocate resources for optimizing. Given the short and long term cost of misapplied optimization, that is not really a very good rate to rely on.


Top 30 Innovations Of The Last 30 Years

Imagine this is 1979: If you were reading this article back then, chances are you would have read it on paper--with a printed newspaper or magazine in your hands. Today, you are probably reading it on a desktop computer, a laptop (or as a printout from either of these) or perhaps even on your BlackBerry or iPhone. The pace of innovation has been so hectic in recent years that it is hard to imagine which innovations have had the greatest impact on business and society.

Is it possible to determine which 30 innovations have changed life most dramatically during the past 30 years? That is the question that Nightly Business Report, the Emmy Award-winning PBS business program, and Knowledge@Wharton set out to answer to celebrate NBR's 30th anniversary this year. NBR partnered with Knowledge@Wharton to create a list of the "Top 30 Innovations of the Last 30 Years." The show's audiences from more than 250 markets across the country and Knowledge@Wharton's readers from around the world were asked to suggest innovations they think have shaped the world in the last three decades.

After receiving some 1,200 suggestions--everything from lithium-ion batteries, LCD screens and eBay to the mute button, GPS and suitcase wheels--a panel of eight judges from Wharton reviewed and selected the top 30 innovations, which were revealed on air and online Feb. 16.

The list is as follows, in order of importance:

1. Internet, broadband, www (browser and html)

2. Laptop computers

3. Mobile phones

5. DNA testing and sequencing/human genome mapping

6. Magnetic Resonance Imaging (MRI)

9. Office software (spreadsheets, word processors)

10. Non-invasive laser/robotic surgery (laparoscopy)

11. Open-source software and services (e.g., Linux, Wikipedia)

13. Liquid crystal display (LCD)

15. Online shopping/e-commerce/auctions (e.g., eBay)

16. Media file compression (jpeg, mpeg, mp3)

18. Photovoltaic solar energy

19. Large-scale wind turbines

20. Social networking via the Internet

21. Graphic user interface (GUI)

22. Digital photography/videography

23. RFID and applications (e.g., EZ Pass)

24. Genetically modified plants

26. Bar codes and scanners

30. Anti-retroviral treatment for AIDS

Before the winners could be selected from the vast number of entries, the Wharton judges first had to define what innovation means in an age dominated by digital technology, medical advancements and mobile communications. The judges included Ian MacMillan, director of the Sol C. Snider Entrepreneurial Research Center; Thomas Colligan, vice dean, Wharton Executive Education; Kevin Werbach, professor of legal studies and business ethics; Karl Ulrich, chair of the operations and information management department; Franklin Allen, co-director of the Wharton Financial Institutions Center; George Day, co-director of the Mack Center for Technological Innovation; Lori Rosenkopf, professor of management; and Mukul Pandya, editor in chief of Knowledge@Wharton.

"Innovation is a surprisingly hard word to define," says Werbach. "Everyone thinks they know it, but when you ask them to explain exactly what an innovation is, it gets very hard." In order to achieve the best results and narrow down the most authentic list of winners, Werbach and his fellow judges defined innovation as more than simply a new invention. "It's something new that creates new opportunities for growth and development," he says, citing cellular technology, which ranks No. 3 on the list. "We've gone from zero to close to three-and-a-half-billion people who have a mobile device and are connected to each other."

Another qualification the judges used to highlight the most sophisticated, powerful innovations was problem-solving value, says Ulrich. "Almost all product design is, in fact, innovation, but the converse is not true," he adds. "Many successful innovations begin with a user need. Some innovations occur because of some serendipitous event or some scientific discovery. The innovator goes and looks for the user and looks for an application of the technology."

An example in the pharmaceutical industry is the development of new chemical compounds to treat medical conditions, as seen in No. 30 on the list, anti-retroviral treatments for HIV and AIDS. "We don't think of that as a product design," says Ulrich, "but we would think of it as an innovation."

Hardly a surprise, the Internet--combined with broadband, browsers and HTML--was ranked first in a list dominated by technological and medical advancements. MacMillan notes that the Internet is an innovation that created an industry and subsequent new technologies, making it an especially important category. "Some [innovations] are more transient and come and go very quickly," he says. "To me, the ones that really matter are the ones that generate whole new industries."

Colligan credits the technology with improving communications and enhancing the standard of living and working, regardless of one's location. "Technology has leveled the playing field," he says, adding that he is not surprised so many innovations fall under the technology category. "It's brought populations that were in poverty, frankly, up to certainly a better standard of living. It's allowed others to enter the workforce in the new global environment."

The panel of judges applied a specific set of criteria to narrow down the innovations in evolving technological and scientific fields. The innovations were selected based on how they impact quality of life, fulfill a compelling need, solve a problem, exhibit a "wow" factor, change the way business is conducted, increase efficiency, spark new innovations and create a new industry.

Day says the Internet ranked high, along with mobile computing and telecommunications devices, because of the way this collective of innovations connects people, saves time and creates mobile access points for knowledge. "The Internet took away a major constraint to accessing knowledge and sharing knowledge," he says. "But a bigger innovation is one that spawns other innovations."

Almost every aspect of business or social relations today is touched by the Internet and the subsequent industries the platform has created on an international scale. "It's hard to imagine tackling a challenge like bringing clean water and good health care to the largest number of people possible in the developing world without using the Internet and the technologies around it," says Werbach. "It's not just a business phenomenon. It's a central organizing platform for anything you can think of."

Werbach also says laptop computers, ranked No. 2, are related to the Internet, thanks to connectivity in the digital realm. "The computer is not something that is in a specific place (i.e., your office)," he says. "It changes the nature of interaction." And it connects with multiple devices that have been created in the last 30 years, including digital cameras, digital music players and wireless printers.

Innovations in Health Care

Many of the innovations capitalize on existing technology to flourish. In some cases, the results not only demonstrate measured success now among select innovations, but also focus on categories that promise even greater success in the future. Most of the scientific selections--including drug developments, surgical advancements and new diagnostic tools--have the potential to spur greater innovation within the next few years to extend life and cure disease. Within the top 10 alone, DNA testing and sequencing, human genome mapping, Magnetic Resonance Imaging (MRI) and non-invasive laser and robotic surgery (laparoscopy) are included.

"DNA has a huge promise to improve diagnoses," says Day, adding that DNA testing and sequencing ranked at No. 5 because of its ability to enhance the pharmaceutical industry by spawning more effective drugs based on genetic factors that have been impossible to determine without it.

Many innovations on the list also subscribe to a "wow" factor, or characteristics that somehow make the innovation surprising, unusual or unexpected, which becomes more difficult to gauge the longer an innovation is used and the more familiar it becomes. But the wow factor, says Ulrich, is important for two reasons: to grab a user's attention and to erect a barrier between it and the competition.

Colligan says this form of competitive marketing and innovation is very much on the minds of the nation's top executives as a way to enhance business goals in a challenging economy. "Innovation creates new revenue streams," he says. "It's a mindset that needs to be started at the top of the organization to allow people to experiment and try different things. It's the opportunity to break through existing models that not only allow for new innovations, but also challenge executives in organizations that have that type of mindset to attract top talent."

Despite a few of the trends revealed in this listing, innovation is not restricted to consumer products organizations or the health care industry. "[Innovation] happens every day," says Colligan, "when executives are looking for solutions to a problem and consultants and professionals are putting together a team. The challenge that professional service firms have is that when good work is done, how do they replicate that?"

The current economic climate weighs heavily on the importance of these 30 innovations, especially as new technology is being used to preserve, and in some cases, revive the commercial landscape. "The innovations are in stark contrast to what we're going through in the economy right now," says Allen. "The innovations also point toward peoples' expectations about the future in the way they change the world." In this category belong the innovations in energy--such as photovoltaic solar energy, which clocked in at No. 18, and large-scale wind turbines (No. 19).

Allen compares these innovations to important strides in the early to mid-20th century, like antibiotics, aspirin, automobiles and improvements in radio technology. "Some things are really fundamental in the way they change the world," he says. "One would hope these 30 innovations would be just as important 30 years or more from now."



