Saturday, July 26, 2008

Battles and Bids Over Pay by Touch


Pay By Touch, with revenues of approximately $70 million and, at its height, some 750 employees, has grown mainly through acquisition. Between 2005 and 2007 it bought at least six companies, including rival biometrics firm BioPay and CardSystems Solutions, an Arizona-based credit-card payment processor. In December 2006, Pay By Touch paid $100 million in cash and stock to acquire loyalty-marketing pioneer S&H Solutions, the 110-year-old company behind S&H Green Stamps.

With its fingerprint payment technology slow to pay off, Pay By Touch this year began focusing on other lines of business. It signed grocers, including Ohio's Dorothy Lane Markets and Harps Food Stores of Arkansas, to a loyalty marketing program that offers personalized coupons and deals to consumers who scan their fingerprints at an in-store kiosk, with offers tailored to each shopper's purchasing history. Another line of business, using fingerprints to make check cashing more secure, had gained some traction with small banks.

Control Battles

But the company still needed more cash to fund operations. Last February, Pay By Touch raised $163 million from three hedge funds—Plainfield, Och-Ziff Capital Management (OZM), and Farallon Capital Management. Plainfield secured its portion of the loan, worth about $50 million, with Pay By Touch shares owned personally by company founder Rogers. Those shares amount to a 20% ownership stake in Pay By Touch but carry "supermajority" voting rights that give the holder control of 64% of the voting shares, enough to control the company.

The loan agreement calls for Plainfield to assume control of Rogers' shares in the event of a default. On Oct. 15, Plainfield's court complaint declared Pay By Touch in default because it failed to deliver its 2005 audited financial results by an Aug. 31, 2007, deadline. That set off a volley of lawsuits and legal moves. Having assumed Rogers' voting power, Plainfield created a new board of directors for Pay By Touch, reinstating two directors whom Rogers had abruptly fired on Oct. 11 and a third who had resigned on Oct. 12.

On Oct. 18, Pay By Touch's Delaware lawyers disputed the validity of Plainfield's action, citing a technicality in the company's bylaws. Plainfield then issued a new order that would have seated Plainfield's new board on Nov. 1. But late on the night of Oct. 31, four Pay By Touch employees filed an involuntary petition aimed at forcing the company into bankruptcy. Rogers has also sought personal bankruptcy, in a case filed in the same court on the same day. Rogers didn't respond to an e-mail seeking comment, and Pay By Touch declined to make him available for an interview.

Usually a bankruptcy filing stays other pending litigation, but the judge in the bankruptcy case has allowed the Delaware case to proceed. A Delaware judge has issued a status quo order, placing the company under the care of a temporary custodian and a temporary board of directors. A trial over control of the company is set for Dec. 21 in Delaware.


For More Info

New DNS exploit now in the wild and having a blast


About two weeks ago, we covered the release of a DNS security fix meant to patch a vulnerability in the system that matches domain names with IP addresses. The flaw had been discovered by security researcher Dan Kaminsky some months earlier but, at the time, details on the exploit were being kept secret. That information has since leaked thanks to an accidental blog post by someone at Matasano Security. Fast forward four days, and hackers, enterprising little children that they are, have released an exploit aimed squarely at the vulnerability.

This would be less of an issue if the widely released patch from two weeks ago had been fully deployed, but a number of companies and ISPs don't seem to have gotten the memo. According to Kaminsky, some 52 percent of DNS servers are still vulnerable to the attack. This is a marked improvement from the 86 percent vulnerability rate in the days immediately following the patch's release, but it's still far too high, especially with dangerous code now squirreling its way across the Internet. Patch deployment is not an instant process, even if the company is on the ball, but we'll hopefully see the number of patched DNS servers skyrocket in the next few days.

Some publications have dubbed the attack Metasploit, but that term refers to the open-source Metasploit Framework that was used to develop it. As for the exploit itself, it's a new variation on a classic DNS poisoning theme. It disrupts the normal translation functions of a DNS server, causing it to redirect users to websites other than the ones they intended to visit. A poisoned DNS server, for example, could send someone to www.RussianMalware.com when they had actually typed www.google.com into the address bar. DNS poisoning isn't new—vulnerabilities have existed for over a decade—but the one Kaminsky discovered increases the power of a successful attack.

Kaminsky has now detailed the methodology of a standard DNS poisoning attack and provides additional information on the vulnerability he discovered. As he describes it, a DNS lookup request is essentially a race between a good guy and a bad guy, each of whom possesses certain advantages. The good guy knows when the race begins, and he knows the secret code that's been sent along with that request in order to verify that the response coming back is actually authentic. The bad guy doesn't have this code, but he actually decides when the request goes out, and he knows about the request before the good guy does.

Normally, the good guy wins the vast majority of these races, and the bad guy is forced to race again and again in an attempt to guess the right authentication value before the good guy provides correct information. What Kaminsky discovered, and what the new hack exploits, is a vulnerability in the recursive nature of the DNS system. DNS is designed to "bump" your request along until it reaches a server that can answer it. If you ask www.DNSTarget.com for a location it doesn't know, DNSTarget.com can refer you to A.DNSTarget.com, B.DNSTarget.com, and so on, until it finds the requisite information. A.DNSTarget.com is what's called an "in-bailiwick" server relative to DNSTarget.com—the information that comes back from that server is automatically trusted and passed on.

Therein lies the problem. Instead of launching an attack straight at www.dnstarget.com and losing 99 percent of the time, the bad guy attacks one of the recursive in-bailiwick servers and then feeds it false information. The in-bailiwick server communicates that data back to DNSTarget.com, which then caches the response—that way, it doesn't need to look the information up again. Problem is, the server has cached poisoned information and doesn't know it. Until that information drops out of the server's cache, the bad guy has effectively won the race.
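The odds behind that race can be sketched with a toy simulation (Python; the function names and parameters here are invented purely for illustration, and real attacks also involve spoofed source addresses and packet timing, none of which this sketch models):

```python
import random

TXID_SPACE = 65536  # DNS uses a 16-bit transaction ID as its "secret code"

def race_once(spoofed_replies, rng):
    """One lookup race: the resolver picks a random transaction ID, and
    the attacker fires `spoofed_replies` forged answers, each guessing a
    different ID, before the real answer arrives. True if any guess hits."""
    txid = rng.randrange(TXID_SPACE)
    guesses = rng.sample(range(TXID_SPACE), spoofed_replies)
    return txid in guesses

def kaminsky_attack(races, spoofed_replies, seed=0):
    """The Kaminsky twist: instead of waiting for a cached record to
    expire, the attacker triggers a fresh race for every made-up
    in-bailiwick name (a.dnstarget.com, b.dnstarget.com, ...), so
    `races` can be made as large as needed."""
    rng = random.Random(seed)
    return any(race_once(spoofed_replies, rng) for _ in range(races))
```

Losing any single race is overwhelmingly likely, since each forged reply has a one-in-65,536 chance of matching. But with thousands of races available on demand, the attacker's overall odds approach certainty, which is why the recursive trick makes the old flaw so much more dangerous.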

For More Info

Digital Domain First It Was Song Downloads. Now It’s Organic Chemistry.


AFTER scanning his textbooks and making them available to anyone to download free, a contributor at the file-sharing site PirateBay.org composed a colorful message for “all publishers” of college textbooks, warning them that “myself and all other students are tired of getting” ripped off. (The contributor’s message included many ripe expletives, but hey, this is a family newspaper.)

All forms of print publishing must contend with the digital transition, but college textbook publishing has a particularly nasty problem on its hands. College students may be the angriest group of captive customers to be found anywhere.

Consider the cost of a legitimate copy of one of the textbooks listed at the Pirate Bay, John E. McMurry’s “Organic Chemistry.” A new copy has a list price of $209.95; discounted, it’s about $150; used copies run $110 and up. To many students, those prices are outrageous, set by profit-engorged corporations (and assisted by callous professors, who choose which texts are required). Helping themselves to gratis pirated copies may seem natural, especially when hard drives are loaded with lots of other products picked up free.

But many people outside of the students’ enclosed world would call that plain theft.

Compared with music publishers, textbook publishers have been relatively protected from piracy by the considerable trouble entailed in digitizing a printed textbook. Converting the roughly 1,300 pages of “Organic Chemistry” into a digital file requires much more time than ripping a CD.

Time flies, however, if you’re having a good time plotting righteous revenge, and students seem angrier than ever before about the price of textbooks. More students are choosing used books over new; sales of a new edition plunge as soon as used copies are available, in the semester following introduction; and publishers raise prices and shorten intervals between revisions to try to recoup the loss of revenue — and the demand for used books goes up all the more.

Used book sales return nothing to publishers and authors. Digital publishing, however, offers textbook publishers a way to effectively destroy the secondary market for textbooks: they now can shift the entire business model away from selling objects toward renting access to a site with a time-defined subscription, a different thing entirely.

The transition has already begun, even while publishers continue to sell print editions. They are pitching ancillary services that instructors can require students to purchase, just like textbooks, but which are available only online on a subscription basis. Cengage Learning, the publisher of Professor McMurry’s “Organic Chemistry,” packages the new book with a two-semester “access card” to a Cengage site that provides instructors with canned quizzes and students with interactive tutorials.

Ronald G. Dunn, chief executive of Cengage Learning, says he believes the printed book is not about to disappear, because it presents a large amount of material conveniently. Mr. Dunn predicted that textbook publishers were “headed for a hybrid market: print will do what it does best, and digital will do what it does best.”

Whether students will view online subscriptions as a helpful adjunct to the printed textbook or as a self-aggrandizing ploy by publishers remains to be seen.

As textbook publishers try to shift to an online subscription model, they must also stem the threat posed by the sharing of scanned copies of their textbooks by students who use online publishing tools for different purposes. The students who create and give away digital copies are motivated not by financial self-interest but by something more powerful: the sweet satisfaction of revenge.

Mr. Dunn says that online piracy is “a significant issue for us.” His company assigns employees to monitor file-sharing sites, and they find in any given month 200 to 300 Cengage textbook titles being shared. The company sends notices to the sites, demanding that the files be removed and threatening legal action.


For More Info

Friday, July 18, 2008

The Future of Biotechnology

[This lecture is based on interviews with 150 of the world’s top scientists, many of them Nobel Laureates or directors of major scientific laboratories, about their conception of the science of the next 20 to 50 years. Many of these predictions are contained in my book, Visions: How Science Will Revolutionize the 21st Century. Of course, errors will be made, but the predictions in this article are not mere idle speculation, but reflect a fairly accurate description by the experts in biotechnology about the evolution of their field.]

Back in the 1980s, when the idea of the Human Genome Project was first proposed by a handful of biologists, the overwhelming reaction was negative, with scientists arguing that it would be prohibitively expensive and would consume too much time and resources. Only a handful of genes had been sequenced, at great expense, and many felt that a crash project to sequence the entire human genome would be impractical and adversely affect funding for other worthwhile projects.

Today, we realize that many of these pessimistic predictions were incorrect, in part because of Moore's Law. Gene sequencing has now been automated and roboticized, with the power of computers doubling every 18 months and results being shared instantly on the internet. This is one of the most important factors driving the ever-accelerating pace of biotechnology. It has translated into a new Moore's Law for biotechnology: the number of genes sequenced doubles every year. As a result, the cost of sequencing a DNA base pair has fallen from $5 to a few cents today. Within 20 years, we may have personalized DNA sequencing and also an "encyclopedia of life" in which all major life forms are decoded.
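As a back-of-the-envelope check on that trend (a sketch only; the assumption that cost halves once per year mirrors the doubling law above, and the function is invented for this illustration):

```python
import math

def years_to_reach(cost_start, cost_target, halvings_per_year=1.0):
    """Years until per-base sequencing cost falls from cost_start to
    cost_target, if capacity doubles (so cost halves) at the given rate."""
    halvings_needed = math.log2(cost_start / cost_target)
    return halvings_needed / halvings_per_year
```

Falling from $5.00 to $0.02 per base pair requires log2(250), or roughly eight, halvings, which is consistent with the drop described above having played out over about a decade.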

This new Moore’s Law, in turn, allows one to make rough predictions about the progress of biotechnology into the next 20 years. Although predictions mentioned here are inevitably based on incomplete information, they will hopefully serve as a useful guide to make plausible projections for the future.


For More Information

Nabbed for speeding? GPS data could get you off the hook

As anyone who has ever gotten a speeding ticket knows (full disclosure: I never have... knock on wood), you often have very little ammo to fight back against the reading that the all-knowing radar gun spat out. But thanks to more sophisticated and affordable technology, that could be changing. GPS data was able to get a California teen off the hook for allegedly going 17 miles per hour over the speed limit, simultaneously casting doubt on the accuracy of police radars and giving hope to tech-savvy drivers.

The story started out simply enough. 18-year-old Shaun Malone was caught by a police radar going 62 in a 45 mph zone last summer. Under most circumstances, people would assume that this was all simply true—it's not unheard of for teenagers to speed, after all. Malone's parents knew differently, though. It turns out that they had installed a GPS device from Rocky Mountain Tracking in his car in order to monitor his driving behavior.

But this was far more sophisticated than your everyday "turn left at the stop light" kind of GPS device—it tracked his speed, sending signals every 30 seconds, and was even capable of sending e-mail alerts to Malone's parents if Shaun ever exceeded 70 mph. (I'm thanking my lucky stars right now that my parents didn't have access to this technology when I was a teenager.) Citing the data from Shaun's GPS device, he and his parents argued that he was going exactly 45 mph at almost the exact time the police radar clocked him at 62.
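The arithmetic behind a position-based speed estimate is easy to sketch (a minimal Python example with invented function names; note that many real GPS units actually derive speed from Doppler shift rather than from position deltas, which this sketch does not model):

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon fixes."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def average_speed_mph(fix_a, fix_b, interval_s=30):
    """Average speed implied by two (lat, lon) fixes reported
    `interval_s` seconds apart, as with the tracker described above."""
    meters = haversine_m(*fix_a, *fix_b)
    return meters / interval_s * 2.23694  # m/s to mph
```

One caveat such evidence has to address: a 30-second average can hide a short burst of speed between fixes, so the two readings are not strictly contradictory on their own.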

For More Info

Saturday, July 12, 2008

P&G backs out of drug development deal with ARYx Therapeutics

Singapore, July 7, 2008: Fremont, California-based biopharmaceuticals company ARYx Therapeutics has said that American major Procter & Gamble Pharmaceuticals has backed out of a collaboration agreement between the companies.
P&G utilized a one-time thirty-day cancellation option linked to the completion of a Thorough QT (TQT) study to end the collaboration agreement between the companies covering the late-stage development and commercialization of ATI-7505, a prokinetic agent in Phase 2 clinical trials for chronic constipation and functional dyspepsia.
"We are surprised P&G would cancel our collaboration after receiving the results of the TQT study given that we achieved a successful result at the study's primary endpoint, and are disappointed in their decision to return the rights to ATI-7505 to us. We have been informed by P&G their decision is based on their view of certain commercial and technical criteria, and that the program no longer fits into their future plans. This decision by P&G does not in any way diminish our confidence in ATI-7505, and we believe the results from the TQT study, along with continued positive clinical and preclinical data, will allow moving ATI-7505 into late-stage development once the program is in the hands of a new partner," said Dr. Paul Goddard, Chief Executive Officer and Chairman of ARYx Therapeutics.
As part of the deal, according to a company release, the results from the just-completed TQT study qualified ARYx for the Tier 1, or highest, milestone payment from P&G. The collaboration agreement gave P&G a thirty-day option period from the lock of the TQT study database either to cancel the collaboration or to pay ARYx the milestone payment provided for under the agreement. P&G exercised its option to cancel the agreement effective immediately. However, P&G has also been very supportive in agreeing to a transition plan for handing the program back to ARYx, allowing it to pursue an optimal partnering package.
As part of the transition plan, the ongoing Phase 2 studies in chronic constipation and functional dyspepsia will be terminated in an orderly manner. No new patients will be enrolled, and those currently on therapy will be withdrawn from the studies over the coming weeks, the release added.

For More Info

Wednesday, July 9, 2008

Detect Spy Cameras


Malaysia declared free from deadly bird flu virus

Malaysia has been declared free from avian influenza, three months after the deadly virus was detected in poultry from a village in the central Selangor state, a minister announced Monday.

In June, the H5N1 strain of bird flu was discovered after a poultry rearer from the Paya Jaras Hilir village in Selangor, next to the capital Kuala Lumpur, reported that 60 of his chickens had suddenly died over three days.

Health ministry officials immediately screened villagers and conducted checks on all birds and poultry.

“The prompt action by the Veterinary Services Department to stamp out the bird flu outbreak according to the protocol had been effective,” said agriculture minister Muhyiddin Yassin.

Following three months of surveillance and laboratory tests that have not shown any traces of the virus, the country had fulfilled conditions set by the World Organisation for Animal Health and has been “declared free from the disease”, Muhyiddin was quoted as saying by the official Bernama news agency.

Following the outbreak in June, a total of 4,226 chickens, ducks and other birds were culled, with 39,939 ringgit ($11,735) paid out in compensation to the livestock owners, he said.

Muhyiddin said the government was still taking preventive measures against the virus, such as conducting checks on poultry farms, prohibiting the import of chicken, ducks and other birds from countries affected by the disease and intensifying checks at border checkpoints to curb smuggling.

“The government has so far spent almost 10 million ringgit ($2.9 million dollars) in compensation to the affected poultry rearers.

“Almost 80,000 birds were culled since the first bird flu case was detected in 2004,” he said.

Following news of June’s outbreak, neighbouring Singapore stopped import of poultry and eggs from the affected area.


For More Info

Tuesday, July 8, 2008

Is agricultural biotechnology safe?


Many of us are concerned by the possible risks of agricultural biotechnology. For example, when you grow transgenic crops, can their modified genes spread to wild relatives of those crops? The latest issue of the California Agriculture magazine carries several articles focusing on transgenic crops, fish and animals. And some discoveries are alarming: "one of the world’s most important crops, sorghum, spontaneously hybridized with one of the world’s worst weeds, johnsongrass, even when they were grown up to 330 feet apart; furthermore, the two plants are distinct species with different numbers of chromosomes." Read more for selected excerpts from these three research papers.

California Agriculture is a peer-reviewed journal reporting research, reviews and news from the Division of Agriculture and Natural Resources of the University of California. Its latest issue contains several articles about transgenic crops, fish and animals which show that "crop transgenes wander in the environment" and ask the question: "But is this cause for worry?"

Here is a link to the abstracts of this July-September 2006 issue.

Let’s start with the paper about transgenic crops, "When crop transgenes wander in California, should we worry?" (PDF format, 10 pages, 1.08 MB). Here is the introduction of this article from Norman C. Ellstrand.

The movement of transgenes into populations for which they are not intended remains a primary concern for genetically engineered crops. Such gene flow in itself is not a risk. However, we know that the transfer of genes from traditionally improved crops into wild populations has already resulted, on occasion, in the evolution of weeds more difficult to control, as well as an increased extinction risk for rare species. Just like traditional crops, genetically engineered crops could occasionally create the same problems.

Before going further, do you have an idea of what kind of genetically enhanced products exist today? Below is a picture showing that many food products are genetically engineered (Credit: Stephen Ausmus, USDA-ARS).

For More Info

Saturday, July 5, 2008

Biotechnology: The Invisible Revolution


JADO initiates phase II trials of lead RAFT modulator in urticaria

04 Jul 2008 - JADO Technologies GmbH, a developer of RAFT intervention therapeutics, announced the start of a Phase II study to evaluate the safety and efficacy of oral TF002, a formulation of miltefosine, in patients with antihistamine-resistant urticaria. TF002 exerts anti-inflammatory activity via RAFT modulation. RAFTs are sub-compartments in the lipid membrane of cells that play a role in complex physiological processes such as the immune and inflammatory responses.

"The start of this systemic trial continues JADO's product development strategy in allergy. Having identified that miltefosine acts via a RAFT mechanism, we want to ensure that we continue to capture the value of the product in the allergy field with our formulations," noted Charl van Zyl, CEO of JADO. "We have made excellent progress with our studies and believe data from this study will continue to support our multi-faceted development program for TF002."

The randomized, double-blind, placebo-controlled study will enroll a total of 75 patients in 7 German centers. The primary end-point of the study is urticaria symptoms assessed by urticaria activity score (UAS) at the end of treatment.

JADO is investigating topical and oral versions of TF002 in Phase II trials in several allergy indications. Proof of concept with the topical formulation in a Phase II study of atopic dermatitis has been achieved. The company and academic collaborators also recently published in Science proof of concept for its Alzheimer's RAFT inhibitor program.

For More Info click Here

Biotechnology & Gene Therapy


Wednesday, July 2, 2008

Intel says to prepare for 'thousands of cores'


Intel is telling software developers to start thinking about not just tens but thousands of processing cores.
Intel currently offers quad-core processors and is expected to bring out a Nehalem processor in the fourth quarter that uses as many as eight cores.

But the chipmaker is now thinking well beyond the traditional processor in a PC or server. Jerry Bautista, co-director of the Tera-scale Computing Research Program at Intel, recently said that in a graphics-intensive environment, the more cores Intel can build, the better: "The more cores we have the better. Provided that we can supply memory bandwidth to the device."

On Monday, an Intel engineer took this a step further. Writing in a blog, Anwar Ghuloum, a principal engineer with Intel's Microprocessor Technology Lab, said: "Ultimately, the advice I'll offer is that...developers should start thinking about tens, hundreds, and thousands of cores now."

He said that Intel faces a challenge in "explaining how to tap into this performance." He continues: "Sometimes, the developers are trying to do the minimal amount of work they need to do to tap dual- and quad-core performance...I suppose this was the branch most discussions took a couple of years ago."
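One habit that does scale is partitioning work by whatever core count the machine actually reports, rather than hard-coding two or four workers. A minimal sketch (Python, with invented function names; threads are used here for brevity, though CPU-bound Python code would need processes to get real speedup, and other languages would use their native threads):

```python
import os
from concurrent.futures import ThreadPoolExecutor

def sum_squares(chunk):
    """The per-worker task: reduce one slice of the data."""
    return sum(x * x for x in chunk)

def parallel_sum_squares(data, workers=None):
    """Split `data` into one chunk per available core and combine the
    partial results. No assumption about how many cores exist, so the
    same code runs on 2 cores or 2,000."""
    workers = workers or os.cpu_count() or 1
    size = max(1, -(-len(data) // workers))  # ceiling division
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(sum_squares, chunks))
```

A further step in the same direction is over-decomposition: creating many more chunks than cores, so a scheduler can keep any number of cores busy. That is essentially the mindset shift Ghuloum is asking developers to make now.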

For More Info

Google, Yahoo spiders can now crawl through Flash sites


As anyone who has had the pleasure of doing web design and development through marketing agencies knows, Flash tends to be wildly popular among clients and wildly unpopular among, well, pretty much everyone else. Part of the reason is that Flash is so inherently un-Googleable; anything that goes into a Flash-only site is basically invisible to search engines and, therefore, the world. That will no longer be the case, however, as Adobe announced today that it has teamed up with Google and Yahoo to make Flash files indexable by search engines.

This announcement has been a long time coming, as Flash developers have been wishing for ways to make their content searchable for close to a decade. Adobe acknowledges this in its announcement, saying that although search engines are able to index static text and links within Flash SWF files, "[Rich Internet Applications] and dynamic Web content have been generally difficult to fully expose to search engines because of their changing states—a problem also inherent in other RIA technologies."

This announcement may also result in some major usability changes (for the better) for Flash on the web. In a post to its Webmaster Central Blog, Google wrote that it can now index all kinds of textual content in SWF files, like that included in Flash gadgets, buttons, menus, entirely self-contained Flash web sites, "and everything in between." Google can now also follow URLs embedded within Flash files to add to the crawling pipeline. This new indexing technology does not, however, include FLV files (video files that are found on sites like YouTube) because those are generated as videos and don't contain any text elements like an SWF file does.

For More Info

Which Linux Distributions Are Dying?


I just read Louis Gray’s post titled “On the Web, If You’re Not Growing, You’re Dying.” It gave me a chilling realization about web services. Like everything else, what goes up must come down. This must apply to Linux distributions too, right? So, what’s happening with Linux? Which distributions are growing? Like Louis Gray, I’m going to use data from Google Trends. People searching the name of Linux distributions on Google can be considered new users. After all, wouldn’t experienced Linux users already know where the websites of the big Linux distributions are?

So, what does this tell us? First of all, Ubuntu is pretty close to being considered the face of Linux. Second, it's the newer distributions like Ubuntu, OpenSUSE, and Fedora that new Linux users are going for. Of course, it's mainly Ubuntu, but I believe that there could be plenty of new users arriving at the Fedora and OpenSUSE communities if both distributions work hard to become more user friendly.

So? What’s going to happen to the distributions like Debian and Slackware? I’ll leave that to you.

i80and’s EDIT: While it is true that Ubuntu is increasingly becoming Linux to many people, DistroWatch.com shows that not all is doom and gloom for the “classic” distros; Slackware has been on the rise for the past six months as of 07/01/08, as has Debian. However, broadening the query to the past 12 months unsurprisingly shows generally flatter growth, with Debian still gaining H.P.D. (hits per day) while Slackware falls.

For More Info

Tuesday, July 1, 2008

The Cookie-Cutter Image Hover Effect


Today I am going to share an interesting and super-cool way to create a dynamic hover effect for images. I call this flexible format the "cookie-cutter effect", and you'll find out why in a moment. First, I shall go off on a tangent to maintain my reputation.

This day and age, a lot of navigational bars on web sites are image-based. Designers prefer to use graphics because it helps make the navigation more stunning and visually appealing (eye candy). Another thing that is becoming nearly mandatory is a hover effect for each of the links, meaning the image changes slightly when the user has positioned their cursor over it. This sends an instant visual cue to the user and lets them know exactly what they are pointing at. It may seem a little silly to think a user does not know where their mouse is pointed, but that certainly is not the case. The map in the center of your local shopping mall with a big arrow that says "You are here" is not silly either; it is actually helpful to some.

Unfortunately, there is a pretty drastic downside to using graphics for key elements such as navigation: they seriously lack extensibility. Say you want to increase the brightness of the hover effect by a few shades. This task, at the very least, requires knowledge of a graphics editing program, and usually some design talent too. In many cases this situation cannot be avoided, but the approach that I am discussing in this post will at the very least allow greater flexibility than the norm.
Let's Begin

[Demo image 1]

To demonstrate the technique, I went ahead and created some random image with my first name next to an odd shape. I stuck the shape in there to illustrate that this approach is applicable to more than just textual images. In the end, the main part of this image (currently black) will be able to be changed to any color we want using CSS only. Consider the base image to the right as the opposite of what we want. The plan is to turn the text into the transparent part, and the background of the image into solid white. The result will be somewhat of a template, if you will. We will be able to use any background color or image behind it to essentially alter the part of the image that is actually intended to represent the foreground.

I became aware of the usefulness of this concept when I read Jason Santa Maria's post the other day about his new blogging expedition. He intends for each post to have its own design theme/color scheme. Currently, he has the inverse effect of what I am demonstrating in this post: the background color of his navigational links changes. This was accomplished by using normal transparent images like the one above, and is still flexible and cool. But what if you want the color of the text to change? I'm sure Jason knows the answer.

For More Info

Major Update to Weave Prototype, 0.2 Development Milestone


As the Web continues to evolve, and more of our lives move online, we believe that web browsers like Firefox can and should do more to broker rich experiences while increasing user control over their data and personal information.

Weave is Mozilla Labs’ project to develop a coherent framework and platform for deeply integrating online services with the browser. Our goals are to enhance the Firefox user experience, increase user control over their personal information, and provide new opportunities for developers to build innovative online experiences.

Today we’re releasing an update to the core data synchronization components of Weave in preparation for the introduction of data sharing and third-party APIs.
Major Updates and Features

* Significant reworking of the installation and setup experience.
* Support for major browser data types, including bookmarks, browsing history, cookies, saved passwords, saved form data, and tabs.
* Intelligent scheduler for determining when to synchronize data between browser and server to improve performance.
* RSA public/private keys and AES encryption of all user data on the client side through NSS, the crypto library used by Firefox.
* End-to-end encryption, with initial support for secure sharing of data with a 3rd party and with XMPP-based notifications of shares.


For More Info