Software survivor celebrates middle age with facelift

File transfer veteran Laplink has taken advantage of the impending demise of Windows 7 to remind those faced with a migration challenge that it still exists.

It is hard not to feel a twinge of nostalgia at the word “Laplink”, recalling the serial cables and floppy disks used three decades ago to migrate data from one PC to another in the pre-connected world.

Or even the breathtaking performance of that parallel cable.


Of course, Laplink and the world have moved on, and the company this week pulled the covers from a substantial update to its PCmover Enterprise tool. While those brightly coloured cables of 30 years ago are long gone (although the company will sell you something more USB or Ethernet-flavoured if you must), shunting data from PC to PC remains a thing, even though the likes of Microsoft insist that everyone should be in the cloud by now.

With a refreshed interface, the company reckons IT admins can allow end users to perform their own migrations (with some controls applied, naturally) from old and busted Windows 7 to new and shiny Windows 10. The now-included Profile Migrator will also transfer files and settings between user profiles on the same PC as well as shunting on-premises Active Directory profiles to something a little cloudier.

And, of course, PCmover will still migrate data between PCs without users needing to fiddle with USB sticks or drag and drop folders around.

Company CEO Thomas Koll told The Register that the enterprise was the target this time around, with the user interface broken away from the actual file transfer engine “making even more customisation possible for the enterprise”.

Adding that “the enterprise is not fully in the cloud”, Koll reckoned that the need to migrate data rapidly in “an inexpensive way” was very much a thing as Windows 10 deployments continue.

Of course, things have moved on from the halcyon days of DOS, and transferring some apps can present a challenge. “Microsoft does not publish a list of what is compatible with Windows 10,” Koll noted, and there is, of course, the challenge of ensuring all dependencies make it over as well as issues around 32 and 64-bit apps.

We wondered what use a well-run enterprise IT department would have for this tool. After all, surely a standardised Windows 10 image, roaming profiles and data resolutely stored on a server would make Laplink’s PCmover redundant?

Not so, according to Koll, who cited case studies where over 100,000 migrations were performed because, well, people do love keeping things stashed locally.

Laplink remains an impressive tool after all these years. However, we still can’t help but regret that the days of a PFY traipsing around the office armed with only a multi-headed blue cable and a disk are very much a thing of the past. ®

Sponsored: What next after Netezza?

Former Cambridge Analytica employee Brittany Kaiser, recognisable to many as the unlikely star of the Netflix documentary The Great Hack, has appeared at Web Summit in Lisbon.

The documentary followed the self-styled whistleblower as she testified to the UK parliament about what she knew when she worked at the firm as a business development manager.

Now with a book out, she has reinvented herself as a data privacy guru aiming to educate youngsters about disinformation, and planning to put data back into the hands of users via blockchain technology.

The Cambridge Analytica scandal broke in 2018 when it emerged that the data of up to 87 million Facebook users had been harvested via a personality quiz – and it has never been exactly clear how that data was used.

The consultancy aided Donald Trump’s election campaign. And Ms Kaiser appeared on the firm’s behalf at a Leave.EU Brexit press briefing – the two organisations say they never signed a contract to work together but Ms Kaiser has alleged that “chargeable work was done”.

In an interview with the BBC, Ms Kaiser said she wanted to see political advertising on Facebook banned.

She said she feared little had changed. Hundreds of companies around the world are still crunching through personal data and throwing it back at people in the form of political ads, she said.

“It is sad that we have to ban all forms of political advertising to stop people being manipulated. But it has to be done,” she added.

“Our electoral laws are not fit for purpose. Facebook functions pretty much the same and now it is not going to ban any politicians who are sending disinformation our way.”

While Twitter has moved to ban political advertising, Facebook has not, and she thinks it will need government regulation to force it to.

It’s important, she said, because of the way data is being “weaponised” in political campaigning.

“Data-driven campaigning gives you the edge that you need to convince swing votes one way or the other, and also to get certain people to show up to the polls,” she said.

“It can also be used to turn off your opponents and get people not to show up to the polls.”

In her book Targeted, she provides new details about the methods she claims were used by Cambridge Analytica in the US presidential election, in particular how it gathered information on different personality types and sent them adverts most likely to resonate with them. The use of so-called psychographics in the Trump campaign had been denied by the firm before its collapse.

“What I saw when I was at Cambridge Analytica was that individuals were deemed persuadable, I don’t mean persuadable to vote for Donald Trump, but persuadable to not vote for Hillary Clinton,” she told the BBC.

“So it was to deter them from going to the polls. And that is the type of tactics where you can use this information in order to persuade certain people to disengage from the political process.”

She gave specific examples of her claims: “We saw an old quote from Michelle Obama being turned into an advertisement that made it look like she was criticising Hillary for staying with her husband, who cheated on her, and that was being targeted at conservative women to get them to not support her.”

An old 1996 speech by Hillary Clinton in which she talks about young black men joining gangs was targeted at African Americans, she said, to dissuade them from voting for the Democrat.

Ms Kaiser said the personality profiling done by Cambridge Analytica was good at targeting “people who are neurotic, and sending them fear-based messaging”.

“Sending messages to people who were extroverted and open-minded wasn’t very effective,” she added.

Some regard Ms Kaiser as an unreliable witness, and question whether her whistleblowing was done more to save herself than to expose the company she worked for.

Prof David Carroll, a data privacy expert who also played a pivotal role in The Great Hack, told the BBC that he thought she was “an important witness to history”.

But he believes that in her book she “obfuscates and omits key aspects, to protect her reputation and her friends”.

In her BBC interview, Ms Kaiser addressed her critics: “Most of those people have no idea how hard it is to be a whistleblower.”

“I spent the past year and a half being unpaid, doing pro-bono work for governments around the world by being an expert witness… never knowing if I’d ever get a job again, never knowing if I was going to be persecuted or if I would be threatened with physical violence.

“You really start to wonder who’s going to come after you.”

It is all a long way from when she entered the world of politics and data, in the Obama campaign, to “figure out what got people excited about politics”.

“I never expected when I joined a company that was going to teach me more advanced forms of data science, that there was going to be anything malicious about it,” she said. “It’s never too late to decide to do the right thing.”

Critics question what she did during her time at Cambridge Analytica. She has been accused of deploying Israeli hackers to influence the presidential election in Nigeria in 2015, something she denies.

“In Nigeria I met with clients who are actually private businessmen, not the campaign itself, who wanted to fund an external campaign. And so I helped, put together a team and sent people out there. They were only there for three weeks, so nothing that they did was really that big or that effective,” she told the BBC.

And what of Alexander Nix, her former employer, with whom she is portrayed as having an affectionate friendship in The Great Hack?

She told the BBC she was no longer in contact with him – in fact, a text message wishing her luck ahead of her testimony to the UK parliament in 2018 was the last time she heard from him, she said.

But she said she believes he is still involved in political consultancy work.

“I hope he has learned from his mistakes and is working more ethically.”

Listen to more of the BBC’s interview with Brittany Kaiser on the BBC World Service’s Business Daily.

‘We want to break down barriers, move ideas seamlessly across applications, across people, across devices’

Ignite Microsoft is previewing its Fluid Framework, first announced at its Build developer event in May, and presenting it as a key technology for content-based collaboration.

Fluid Framework only occupied a brief and minor slot in CEO Satya Nadella’s keynote at Redmond’s Ignite conference, running this week in Florida, yet the forthcoming preview, expected around a month from now, is one of the most intriguing and potentially significant technologies featured at the event.

Microsoft’s Office 365 is cloudy collaboration with its roots in desktop applications, especially the familiar trio of Word, Excel, and PowerPoint. Documents get created in desktop Office and emailed back and forth, or perhaps sent as SharePoint links, in a manner that has not radically changed for 25 years. Google Docs, born on the web, is in some respects more advanced since it is browser-centric, better at collaboration, and conceptually treats content more like web pages than files, although users are still invited to create a document, a spreadsheet, or a presentation.

Fluid Framework is Microsoft’s attempt to change the way people collaborate and update content in Office 365. At a high level it is two things. First, a high-performance synchronization technology that is fast enough that it feels instant.

‘People feel like they’re all on the same device’

“Real time collaboration is pretty standard,” Rob Howard, general manager of Microsoft 365 Foundations, told The Register at Ignite.

“There is a point where we see people behaving differently: when you get to 10, 15, 20 millisecond latencies, as opposed to a few hundred, people feel like they’re all on the same device.”

The claim is that this technology scales to hundreds of concurrent authors. Those contributors may, in some cases, be bots rather than humans, enabling scenarios like simultaneous translation into multiple languages.

“The notion of having an object that lives on one person’s machine and synchronizing it with an object that sits on another person’s machine is not a new notion,” Howard continued. “We think we’re applying it in a new way. We’re using an eventually consistent model in order to have this high-scale experience that lets people distribute those objects across browsers, across experiences, across sessions. We think that is new.”
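Howard’s “eventually consistent model” is easier to picture with a toy example. The sketch below is not Fluid Framework code and Microsoft has not detailed its internals here; it is a minimal, hypothetical last-writer-wins register of the kind used in state-based CRDTs, showing how replicas of a shared object can merge each other’s state in any order and still converge.

```python
# Minimal sketch (NOT Fluid Framework code): an eventually consistent
# "shared object" modeled as a state-based last-writer-wins register.
# Each client holds a replica; replicas merge in any order and converge.

class LWWRegister:
    """Last-writer-wins register; a (logical clock, client id) stamp breaks ties."""

    def __init__(self, client_id):
        self.client_id = client_id
        self.value = None
        self.stamp = (0, client_id)  # (logical clock, client id)

    def set(self, value):
        # A local write bumps the logical clock and records the writer.
        self.stamp = (self.stamp[0] + 1, self.client_id)
        self.value = value

    def merge(self, other):
        # Keep whichever write carries the higher stamp; tuple comparison
        # orders by clock first, then client id as a deterministic tiebreak.
        if other.stamp > self.stamp:
            self.value, self.stamp = other.value, other.stamp

# Two replicas edit concurrently, then sync in either order.
a, b = LWWRegister("alice"), LWWRegister("bob")
a.set("draft 1")
b.set("draft 2")
a.merge(b)
b.merge(a)
assert a.value == b.value  # replicas converge to the same state
```

The point of the exercise: because `merge` is commutative and idempotent, the order in which browsers, sessions, or devices exchange state does not matter, which is what lets such objects be distributed "across browsers, across experiences, across sessions".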

Second, the framework is a componentized document model. In the keynote example, a PowerPoint chart is based on a table of data, and that table is copied, or rather shared, with a mobile application, such as Teams. A user updates the data on the mobile device and the PowerPoint chart instantly updates. It is not PowerPoint running on the mobile device, just the table component surfaced in a different application.

The Fluid Framework example in the keynote showed a PowerPoint chart updated in real time from a table in another application on another device

“Fluid Framework lets us decompose a document into paragraphs and individual components, take a table, paste it into another application, like Outlook or Teams, so that people can collaborate in the context of where they are, and continue to do their own work,” said Howard.

Componentized, or compound, documents will bring back memories for some – like, perhaps, Object Linking and Embedding (OLE), introduced by Microsoft in the 1990s. OLE 2.0 allowed one document to be embedded in another, such as an Excel spreadsheet in a Word document, with the link maintained so that when the spreadsheet was updated, the Word document updated too. Then there was Office Binder, part of Office 95, 97, and 2000, which enabled users to combine multiple document types into one.

In practice it was clunky and not much used. OLE 2.0 was famous for bringing Windows to its knees in the early days. Another early and doomed example of compound documents was Apple’s OpenDoc. What do compound documents look like in the internet era? Fluid Framework may be Microsoft’s answer.

Fluid Framework will initially be a feature of Office 365. The preview takes two forms. There will be a customer preview here. A business Office 365 subscription will be required. It will enable users to create componentized documents, do real-time collaboration, and try cross-application features, though this will be simplified compared to the final release. Components available will include tables, text with basic markdown styling, checklists, tables that track simple actions with users assigned as owners, @mentions that notify users, and a date picker.

The preview may be for browser-based access only. Next in line are the Office mobile applications, with desktop Office perhaps to follow eventually. This is web technology, and it reverses the usual expectation that desktop is the “full Office”, especially on Windows.


The second form of the preview is private and aimed at developers. You can sign up here. “We think the technology is broadly useful for collaborative web experiences. We want to work with developers to understand the scenarios they’d like to apply it to,” said Howard.

If multi-authoring documents at scale sounds like a recipe for chaos, it is mitigated by comprehensive change tracking. “It’s part of the eventual consistency model,” said Howard. “You could imagine being able to do things like scrub the changes back and forth or create a keyframe of the document that you could go back to. You can even do things like branch and fork documents.” These concepts feel natural for developers, but may not be intuitive for other Office users. “From a user experience perspective this is something we have to figure out,” Howard remarked.
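The scrubbing, keyframing, and forking Howard describes fall out naturally if a document is represented as an ordered log of operations. A hypothetical sketch follows; the `replay` function and the insert-only operation format are invented for illustration and are not Fluid Framework’s real data model.

```python
# Hypothetical sketch: a document as an append-only log of insert
# operations. "Scrubbing" back through history is replaying a prefix
# of the log; forking a branch is copying a prefix and appending new
# operations. Not Fluid Framework's actual model.

def replay(ops, upto=None):
    """Rebuild the document by applying (position, string) inserts in order."""
    text = ""
    for pos, s in (ops if upto is None else ops[:upto]):
        text = text[:pos] + s + text[pos:]
    return text

ops = [(0, "Hello"), (5, " world"), (11, "!")]
print(replay(ops))          # full history: "Hello world!"
print(replay(ops, upto=2))  # scrubbed back one step: "Hello world"

branch = ops[:2] + [(5, ", Fluid")]  # fork from step 2 and diverge
print(replay(branch))       # "Hello, Fluid world"
```

A “keyframe” in this scheme would simply be a cached `replay` result at a given log index, so stepping through history need not start from the beginning each time.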

Microsoft, he said, plans to “bring this technology through all applications in Microsoft 365. This is foundational.” The intention is “to break down the barriers between those apps so you can move ideas seamlessly across applications, across people, across devices.”

The Fluid Framework project may fall flat. The concept of live content that can change before your eyes is often not appropriate for business documents. Performance may fall short of Microsoft’s claims. Users may be reluctant to break the habit of creating static content in desktop Office and sending it here and there for comment and revision. SharePoint was intended in part to break that habit and has only been partially successful.

All that granted, the Fluid Framework is an intriguing new take on how business documents work, and lets go of the idea that desktop Office is the best way to create them. ®


How IT titan’s ‘Incentive Plan Letter’ anti-contract is starting to unravel

Analysis A 17-year veteran of IBM is suing the American giant claiming it failed to pay him promised sales commissions – a charge made by other Big Blue salespeople in at least two dozen similar lawsuits over the past fifteen years.

Mark Comin, of San Rafael, California, who worked for IBM from 2001 through 2018, filed his complaint in a US District Court in San Francisco on Monday, alleging Big Blue violated the Golden State’s unfair competition law by knowingly refusing to provide a valid, written contract and refusing to pay earned commissions.

The lawsuit [PDF] states that IBM employs hundreds, if not thousands, of sales reps and managers in California who earn sales commissions yet does not provide employees with a formal binding contract guaranteeing what they will be paid.

Instead, it provides sales reps with an Incentive Plan Letter (IPL) that includes a disclaimer stating that the letter is not a contract to pay commissions. IBM thus does not have a contract with its salespeople and, excluding agreed upon salary, can pay them… whatever.


According to his complaint, Comin helped close a $6.4m deal with engineering firm AECOM and was owed a portion of $925,000 for his efforts. IBM paid him nothing. There was also a $3.375m deal to license third-party product Carbon Black Response, via IBM, to Intel. Comin received 10 per cent of the expected commission. He also helped sell a product called IBM Security Guardium to Apple for $1.15m. The deal closed in 2017 and Comin should have received a commission, but didn’t get anything. Fed up, he left the company in February 2018.

Because Big Blue has failed to pay salespeople commissions based on quotas in the IPL, a number of IBM workers have sued the company. But most have not been successful.

“Each time, IBM’s defense has been the same: IBM owes nothing because the employees do not have an enforceable contract for the payment of commissions,” Comin’s complaint states. “IBM claims that the IPL is not an enforceable contract, nor is there any other enforceable contract.”


California is an “at-will” employment state, meaning bosses can fire workers without showing cause. IBM is effectively advancing the idea of “at-will” payments for sales commissions, because in the absence of a valid contract, it contends it can cap payments as it sees fit.

Matthew E. Lee, a partner at Whitfield Bryson & Mason LLP and one of the attorneys representing Comin, told The Register IBM has pushed this line of argument in court successfully for years.

An IPL may seem to people to be a contract, he said, but it contains language that says it’s not a contract. And so previous breach of contract claims have been thrown out by the courts because IBM salespeople don’t actually have a contract that could be breached.

“That happened enough times that legislators and states started passing laws,” Lee explained.

A change to the California Labor Code in 2013 required employers in the state to provide commission-earning employees with a valid contract.

“That’s obviously a problem for IBM,” said Lee.

“This is a have-your-cake-and-eat-it-too situation,” he said, noting that companies like IBM benefit from the unusual motivation arising from the promise of uncapped commissions, “particularly in software sales because these are huge deals.”

Commission litigation has come up for Oracle sales people, too.

And HP/HPE in January settled a commission underpayment lawsuit for $25m.

Lee said there’s a disconnect at IBM between internal marketing efforts designed to incentivize sales and the IT titan’s subsequent litigation strategy when it decides a commission is too large and doesn’t want to pay.

Lee said his firm has handled eight or nine of these cases, and things have finally started to go against IBM. A few weeks ago, he said, his team got its first denial of summary judgment – meaning IBM’s efforts to get the complaint thrown out failed.

That occurred in a lawsuit filed last year on behalf of David Swafford, who says IBM failed to pay about $250,000 in owed commissions for sales to Oracle and Sabre. The judge in that case, Lucy Koh, found Swafford’s claim that IBM committed fraud to be credible enough that the case should be allowed to proceed.

Koh’s partial denial of IBM’s motion to dismiss notes that there’s evidence to support Swafford’s contention that IBM wants salespeople to believe their commissions are uncapped because otherwise the corporation would be unable to recruit good sales representatives and would lose deals.

Lee said he’s been taking depositions for another IBM case in San Francisco in which he interviewed the head of IBM’s commissions program. After asking her about the capping of commissions, he said, “I asked her whether employees could trust IBM. She told me, ‘I don’t know.’ That’s incredible to me.”

IBM did not respond to a request for comment. ®


Regulator OKs Sprint-T-Mobile US merger because… 5G? Higher prices?

Analysis The FCC today ushered in a new era of reduced competition in America’s mobile market by approving a merger between the third and fourth largest operators, T-Mobile US and Sprint.

Exhausted over having to choose between different mobile operators, folks in the Land of the Free will now only have to pick one of three: AT&T, Verizon or T-Mobint. And, according to the FCC, the decision will both “close the digital divide” and “promote wide deployment of 5G services.”

No one apart from the three FCC commissioners who approved the deal can see it as anything but a dangerous reduction in commercial rivalry that will lead to higher prices and greater profits at the expense of subscribers. The regulator’s supremo and former Verizon executive Ajit Pai appears convinced otherwise.

He is even certain that reducing competition in the mobile industry will “enhance competition in the broadband market,” noting [PDF] that “with New T-Mobile’s network, 90 per cent of Americans would have access to mobile broadband service at speeds of at least 100 Mbps, and 99 per cent would have access to speeds of at least 50 Mbps.”

Less competition is in fact more competition, according to Pai: “New T-Mobile will be far better positioned to deploy Sprint’s extensive 2.5 GHz spectrum holdings than would Sprint standing alone, given that company’s financial situation.” And in case that’s not clear enough, he adds: “So let’s be clear: A vote against this transaction is a vote against strong, swift mid-band 5G deployment.”

It makes you wonder what the point is of having three mobile telcos: why not just have one and really maximize that 5G deployment? Imagine if a cellular network, say Verizon, didn’t have to waste all that money advertising its services because citizens only had one provider to go to. Plus there’d be no need to spend so much money buying up spectrum. Just imagine how much better things would be.

Not entirely unanimous

But despite Pai’s impeccable logic, there were two votes against strong, swift mid-band 5G deployment – by two of his fellow commissioners who feel somewhat differently about the merger approval.

Commissioner Jessica Rosenworcel put out a 14-page dissent [PDF] in which she pointed out that “three companies will control 99 percent of the wireless market. By any metric, this transaction will raise prices, lower quality, and slow innovation, just as we start to deploy the next-generation of wireless technology.”

She had some harsh words for her fellow commissioners, who she said had been “wooed by a few unenforceable concessions and hollow promises from the two companies involved.” She railed: “So many people already think that Washington is rigged against them. It saddens me when on too many occasions this agency proved them right.”

As with many other recent decisions at the federal regulator, she notes that the decision was made long before the details were even considered: “Three of my colleagues agreed to this transaction months ago without having any legal, engineering, or economic analysis from the agency before us. They agreed to this transaction before the Department of Justice could finish its review, ending a longstanding practice of coordinating efforts between the agencies. Consumers deserve better from Washington authorities charged with reviewing this transaction.”

While Pai paints a glorious picture of universal broadband and 5G access, Rosenworcel sees a “cozy oligopoly,” and “an end to the competitive rivalry that reduced prices by 28 percent during the last decade.” Unlimited data plans, free international roaming and offers to pay termination fees will come under pressure, she predicts. And 5G? It will “proceed more slowly and yield fewer jobs without the fuel of competitive pressure.”

She goes on: “Both the FCC and Department of Justice should know better… So many of America’s most pressing economic and political problems can be traced back to this kind of market consolidation… with less competition, rates rise and innovation falls.”

And now the good news

Rosenworcel has much more to say in an in-depth breakdown of the deal but concludes: “I dissent to the FCC’s decision to consolidate the wireless market in the hands of three companies. I dissent to the process the agency used to reach this result, which hid too much of the negotiations and this decision out of view from the public. And I dissent to the remedies the FCC adopts that gamble our 5G future on a new wireless entrant and put all the risk of this merger on the backs of American consumers.”

Aside from FCC commissioners getting furious with one another, what can folks here really expect to see? Well, significant price increases for one – that’s right, the FCC’s own approval notes that the deal “would likely lead to significant price increases.” So that’s good.

Anything else? Well, yes, the US Department of Justice was opposed to the deal and said in an official report that four competitors were required to ensure effective competition. But it removed its objections to the merger after some wrangling behind the scenes.

The FCC’s own expert staff are opposed to the merger, though their critical reports were “rewritten by the FCC’s political leadership behind closed doors.” And no less than 16 states have sued in an effort to block the merger.

And in that lawsuit lies the only remaining barrier to a less-competitive, more-expensive US mobile market. The bipartisan group of state attorneys general say that the merger will result in an “unacceptable loss of competition.”

The courts will have to decide whether to take their word for it and believe in the glorious picture painted by Ajit Pai and his cronies that – it feels churlish to point out – has no foundational basis in reality. ®

Sponsored: Your Guide to Becoming Truly Data-Driven with Unrivalled Data Analytics Performance

Watchdog agrees one day of profit ought to be enough after 5 years of arguing

Comment Toothless American consumer watchdog the Federal Trade Commission today agreed to let AT&T settle a five-year battle over phony “unlimited data” promises for just $60m. That’s $40m less than expected, and less than one day of annual profit for the telco giant.

The agreement [PDF] lets AT&T claim it did nothing wrong; the settlement, as ever, comes with no admission of guilt. The FTC’s official announcement repeatedly stresses that claims AT&T throttled millions of customers’ mobile broadband access, despite selling them “unlimited data” plans, are mere “allegations.”

The settlement was agreed to by four of the five FTC commissioners, with Rebecca Kelly Slaughter recusing herself for an undisclosed reason. Another commissioner, Rohit Chopra, made it clear that he was not happy about the deal, despite agreeing to it.

In a statement [PDF], Chopra called AT&T’s actions a “scam,” “scandal,” and “massive fraud,” and noted he “would have liked to see AT&T pay more,” but recognized “the risks and resources associated with litigation.” He concluded: “The bottom line is that AT&T fleeced its customers to enrich its executives and its investors.”

The original FTC complaint was filed in 2014 after the regulator received numerous gripes from AT&T customers that their phones’ mobile internet download speeds had slowed to a crawl despite being on “unlimited” data plans. It turned out that after people had fetched 2 to 3GB of data each month, AT&T simply throttled their accounts, making web browsing difficult and video streaming near-impossible.

But we never!

In its defense, AT&T, which last year banked $20bn in profit, said it had put out a press release – back in 2011 – noting that it was changing the terms of its unlimited data plan, and at some point had also noted the change on people’s bills. It also said it sent text messages to customers letting them know it was about to limit their data.

But both the FTC and US comms regulator FCC found that was wildly insufficient given the size and scope of the impact of the change: an estimated 3.5 million people were affected by the policy in the intervening years.

Commissioner Chopra also noted that AT&T’s customers were effectively captive: “AT&T baited subscribers with promises of unlimited data, trapped them in multi-year contracts with punishing termination fees, and then scammed them by choking off their access unless they moved to a more expensive plan,” he noted, adding: “The AT&T throttling scandal is an important case study into how dominant firms operating without meaningful competition can easily renege on their contractual obligations and cheat consumers who have almost no recourse.”

AT&T claimed it had done nothing wrong, prompting the FTC to sue it. AT&T lost the case in district court in 2014, so it appealed and, somewhat incredibly, its defense was not that the claims were incorrect but that the FTC itself had no authority over AT&T.

AT&T told the appeals court that since it is a “common carrier” under the law, only the FCC was allowed to fine it. And, amazingly, the appeals court, in 2016, agreed. But that created its own problem because by that point the FCC under new chairman, and former cellular industry executive, Ajit Pai had disowned its role in overseeing AT&T in this capacity as part of his ideological opposition to net neutrality.

Guilty! For now

The FCC had previously also found AT&T guilty of the same practice and fined it $100m for misleading its customers. AT&T played hardball with the FCC, claiming that rather than owe $100m, the fine should be just $16,000.

But then the FCC scrapped data privacy rules that were due to come into effect just days later, and tore up the rules that designated internet service providers as “common carriers.” With the court ruling that the FTC did not have the authority to punish the telco, and the FCC having discarded its responsibilities over telcos, telecoms giants suddenly had nobody to answer to when it came to how they treated people and their complaints.

And so both the FTC and FCC desperately appealed to the ninth circuit to hold a full “en banc” review of the decision. In February 2018, the full ninth circuit overturned its previous decision and said that, yes, in fact the FTC did have authority in this case and it could proceed with getting money out of AT&T over the data throttling.

The FTC provides no information about what has been going on since that decision 20 months ago, but as part of the settlement AT&T is prohibited from making any representation about the speed or amount of its mobile data, including that it is “unlimited,” without disclosing any material restrictions on the speed or amount of data. The disclosures also “need to be prominent, not buried in fine print or hidden behind hyperlinks,” the FTC noted.

Getting your money

The $60 million will be put into a fund and used to give partial refunds to current and former customers who had signed up to the “unlimited” plan before 2011 and had been throttled by AT&T. If you’re still an AT&T subscriber, you should get a credit direct to your account while former customers will be sent a check, although how many of them will be living at the same address eight years later is a matter of debate.

In short, this case is an extraordinary indicator of just how broken the American system of regulation is. AT&T sold customers one product, then changed the rules and claimed it had done nothing wrong. When challenged, it then embarked on a massive five-year legal battle where it successfully questioned the very legitimacy of the federal regulator that is supposed to oversee it.

The only reason the FTC is able to levy the current fine is because the cable companies were so successful in undermining the other main federal regulator, the FCC, that the FCC threw away its responsibilities altogether and accidentally created a regulatory black hole.

And after all that, even with the $60m fine, AT&T has made a healthy profit from ripping off its own customers. It’s an absolute shambles. ®


The BOFH’s secret command centre discovered?

Buying a house is a major ordeal. You go from door to door, months zip past and something’s never quite right… then you find it, “the one”. A five-bedroom home in Pinner, Harrow, northwest London.

“I could see the girls growing up here,” your partner utters with tears in their eyes.

Indeed, the listing on Rightmove looks none too shabby on the surface. “This family haven is situated in a family-friendly location just approx. 1.7 miles [2.7km] from Eastcote Station and 1.1 miles [c 1.8km] to Pinner High Street,” the description burbles.

The accommodation briefly comprises a bright and spacious entrance hallway with doors leading to all rooms. The dining room enjoys a bay window adding a natural flow of light into the property. To the rear aspect is a reception room which benefits [from] a fire place, wooden floors and access to a conservatory. The warm and welcoming conservatory boasts double French doors opening out onto the large patio area. The stunning kitchen has a range of eye and base level units including integrated appliances, a breakfast bar, tiled flooring, spot lights, double doors leading to the conservatory and Velux windows allowing additional light to flood in.

Notwithstanding the odd use of “briefly” – a buyer would probably prefer the home to be “bright and spacious” on a permanent basis – and inanimate objects “enjoying” things, it’s the dream home! Right?

Who are we kidding? This is a Register bootnote – where dreams go to die.

Because not once does the cheerful text acknowledge the sheer number of electrical sockets festooning the interior walls of the house. Trypophobes and OCD sufferers should probably look away… now.

House with loads of sockets

Five fridges, one breaker? Pics: Andrew Pearce Pinner/Rightmove

And there’s more where that came from.

The bizarre setup – or killer app, depending on how you look at it – along with legions of downlights is probably why the house is on sale at a steal(?) for £1,350,000, a mere fraction of the yearly running cost.

One can’t help but wonder how the previous owners were using the property. An intel agency’s forward operating base? Illicit bit barn? The BOFH’s secret command centre?

Twitter wonks had a few ideas.

Maybe they just wanted to keep their options well and truly open as to where the TV could go.

We asked the good folk at Electrical Safety First for their thoughts on the setup, and they didn’t quite know what to make of it.

“One would imagine it would overload the fusebox, although this might be dealt with if the room was on a separate circuit. But if you used all the sockets it would most likely blow the circuit breaker/RCD on that circuit. (Difficult to say, as we don’t know what is being plugged in).

“Another thing to consider would be the number of holes in the walls created by putting in all those sockets – this could increase the risk of fire spreading, should one start.”

They concluded: “While it might be feasible to have all these sockets, we wouldn’t recommend it.

“As to what it is used for – no idea!”
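For a sense of scale, here is a rough, hypothetical back-of-envelope check of Electrical Safety First’s overload point, assuming a typical UK ring final circuit protected by a 32A breaker at 230V nominal mains (both figures, and the appliance wattages, are illustrative assumptions, not from the listing):

```python
# Hypothetical figures for illustration only: a UK ring final circuit
# is commonly protected by a 32A breaker, with 230V nominal mains.
BREAKER_AMPS = 32
MAINS_VOLTS = 230

# Maximum continuous load before the breaker trips, in watts (P = IV).
max_watts = BREAKER_AMPS * MAINS_VOLTS
print(max_watts)  # 7360

# A fridge draws very roughly 150W while its compressor runs...
FRIDGE_WATTS = 150
print(max_watts // FRIDGE_WATTS)  # 49 -- dozens of fridges, in theory

# ...but swap in a few 2kW kettles or heaters and the headroom vanishes.
HEATER_WATTS = 2000
print(max_watts // HEATER_WATTS)  # 3
```

In other words, the number of sockets isn’t itself the limit; what gets plugged into them is, which is presumably why the charity said it was “difficult to say” without knowing the load.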

Mercifully, in the bathrooms and kitchen, sockets are noticeably absent, and the money spaffed on rigging the house to explode in the event of a short circuit could have been better spent on the loving hands of a landscaper for the garden.

We’re sure Reg readers could think of a way to harness all that raw power. Don’t forget to weigh in below – and happy house hunting! ®


MoD offshoot names winners who dipped into £2m anti-drone ideas pot o’ gold

The British government has funded 18 anti-drone projects as part of its £2m push to stop a repeat of the Gatwick drone fiasco of 2018 – including a friendly drone swarm that will employ “peregrine falcon attack strategies” to down errant unmanned flying things.

Among the ideas that have scooped up to £800,000 each in funding for further development are plans to use machine learning to train cameras and other sensors on what a small drone looks like to aid early detection, as well as direction-finding of 4G and 5G-controlled drones.

In addition, one plan includes “low risk methods of stopping drones through novel electronic defeat or interceptor solutions”. That is, jamming a rogue drone or flinging something into it to knock it out of the sky.

One wacky firm is working on a counter-drone swarm that will use “peregrine falcon attack strategies”.

Funded by the Defence and Security Accelerator (DASA), as we reported in April, the competition is intended to bulk out the British state’s ability to KO unwanted drones at will, whether they’re being flown near the country’s second-busiest airport, dropping drugs into prisons, flying over sports stadiums and livestreaming fixtures, or anything else naughty you can think of.

More seriously, the Armed Forces are increasingly worried about the threat from drones, as was demonstrated by the first ever landing of what the MoD now calls “aerial vehicle systems” aboard new aircraft carrier HMS Queen Elizabeth.

DASA’s David Lugton said in a canned quote: “The threat from UAS [unmanned aerial systems] has evolved rapidly and we are seeing the use of hostile improvised UAS threats in overseas theatres of operation. There is a similar problem in the UK with the malicious or accidental use of drones becoming a security challenge at events, affecting critical infrastructure and public establishments, including prisons and major UK airports.”

Around 90 bids were received for the DASA funding, said the organisation.

Among the successful bidders were defence multinationals BAE Systems, Northrop Grumman, Thales and MBDA, all with various similar proposals for radar and sensor systems intended to pick up small drones, as well as privatised British defence research establishment Qinetiq, which is working on an electromagnetic death ray “hard kill for disrupting the UAVs’ on-board electronics”.

Phase 2 of the competition begins next year, with the intent being to develop the 18 shortlisted ideas into something usable by military and police agencies alike. ®


Chinese tech giant Xiaomi has unveiled the world’s first mainstream handset to feature a 108 megapixel camera.

The extra high-resolution sensor was developed by Samsung, which has yet to feature it in its own products.

The firms say the benefit is that it delivers “extremely sharp photographs that are rich in detail”.

However, one early test of the tech indicates that its images contain more digital distortions than those produced by lower-resolution smartphones.

For now, the Mi CC9 Pro Premium has only been announced for the Chinese market, where the base model costs 2,799 yuan ($400; £310).

But Xiaomi has said it will use the same component in the Mi Note 10, which will be launched on Wednesday and sold more widely.

The firm is currently the world’s fourth-bestselling smartphone vendor, according to research firm Canalys, with a market share of 9.1%.

Its sales are rapidly growing in Europe and it has just announced its intention to expand into Japan in 2020.

Merged pixels

Until now, 100MP+ sensors have typically been the preserve of medium-format digital cameras, which can cost tens of thousands of pounds.

Trying to squeeze lots of resolution into a smaller smartphone component runs the risk of increasing cross-talk, a phenomenon where the electrical activity of one pixel spills into its neighbours, as they are packed so closely together. This results in digital noise in the final image.

In addition, since each pixel needs to be smaller than normal to fit into the same space, each receives less light, causing further problems in low-light conditions.

Samsung’s Isocell Plus sensor partly addresses these problems by being larger in size than most smartphone sensors.

But its key innovation is that its pixels are arranged in groups of four, with each set sharing the same colour filter to detect red, green or blue light.

By default, data from each group is merged together to mimic the behaviour of a larger pixel. This results in a 27 megapixel photo.

But if there is enough light, the user can override the function to obtain a 108MP image. This is obtained via a software algorithm that remaps the pixels to simulate what would have been recorded, had they been arranged in the normal pattern.
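The default “merge” step can be sketched roughly in code. The snippet below is a generic illustration of 2x2 pixel binning, not Samsung’s actual algorithm; the function name and the use of plain averaging are assumptions for demonstration:

```python
import numpy as np

# Hypothetical sketch of 2x2 pixel binning ("merging"): each
# non-overlapping 2x2 block of same-colour pixels is averaged to act
# like one larger pixel, quartering the resolution (108MP -> 27MP).
def bin_2x2(raw: np.ndarray) -> np.ndarray:
    """Average each non-overlapping 2x2 block of a raw sensor array."""
    h, w = raw.shape
    assert h % 2 == 0 and w % 2 == 0, "dimensions must be even"
    # Reshape so each 2x2 block gets its own pair of axes, then average.
    blocks = raw.reshape(h // 2, 2, w // 2, 2)
    return blocks.mean(axis=(1, 3))

# A toy 4x4 "sensor" bins down to 2x2 the same way a full-resolution
# frame would bin down to a quarter of its pixel count.
sensor = np.arange(16, dtype=float).reshape(4, 4)
binned = bin_2x2(sensor)
print(binned.shape)  # (2, 2)
```

The full-resolution 108MP mode works in the opposite direction, remapping the grouped pixels to simulate a conventional colour-filter layout, which is a far more involved process than the averaging shown here.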

The design, however, is not without its issues.

“Images from the Mi CC9 Pro Premium Edition showed more artifacts than our other top-scoring phones,” said review site DXOMark, which was given early access to the new handset.

It added that the phone delivered “limited dynamic range compared to other top performers”, meaning it tends to capture less detail in the highlights and shadows.

Users must also bear in mind that the 108MP shots will take up much more storage than normal and require more processing power to edit.

However, the phone does also include other lower-resolution sensors on its rear for telephoto portrait, wide-angle landscape, and macro close-up shots – which helped DXOMark give it a high score.

Xiaomi previously announced it would use the 108MP sensor in the Mi Mix Alpha, which was unveiled in September.

But that handset was pitched as a luxury device with a 19,999 yuan ($2,856; £2,218) price tag, and is not due for release until December.

One expert said the inclusion of the camera in the mass market Mi CC9 Pro and Mi Note 10 should help the phones stand out.

“Mobile phone manufacturers will go to almost any length to turn people’s heads, and this enormous megapixel camera is one way of grabbing attention,” commented Ben Wood from the CCS Insight consultancy.

“That doesn’t necessarily mean that you’re always going to get the best picture in all conditions. But for many consumers, there’s a perception that the bigger the number, the better the product.”

RISC-V-based blueprints available for all to freely use

Google and several partners today teased OpenTitan – an open-source blueprint for a Root of Trust (RoT) system-on-chip based on RISC-V and managed by a team in Cambridge, UK.

Hardware RoT is a means of verifying that the firmware and system software in a computing device have not been tampered with, enabling features such as secure boot. Hardware RoT can also verify the integrity and authenticity of software updates, and prevent a system from being rolled back to an earlier version with known vulnerabilities. It is the lowest-level security piece in a trustworthy system.
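Conceptually, the check a root of trust performs can be sketched like this. The snippet below is a loose illustration only, using a keyed digest as a stand-in for a real signature scheme; the names and the HMAC-based approach are assumptions and do not reflect OpenTitan’s actual design:

```python
import hashlib
import hmac

# Illustrative sketch only, not OpenTitan's design: an immutable boot
# stage holds a trusted key and refuses to hand off control to firmware
# whose digest doesn't verify against a provisioned value.
TRUSTED_KEY = b"key-burned-into-rot-hardware"  # hypothetical

def measure(firmware: bytes) -> bytes:
    """Compute a keyed digest (a stand-in for a real signature check)."""
    return hmac.new(TRUSTED_KEY, firmware, hashlib.sha256).digest()

def secure_boot(firmware: bytes, expected_tag: bytes) -> bool:
    """Boot only if the firmware's digest matches the trusted value."""
    return hmac.compare_digest(measure(firmware), expected_tag)

good = b"firmware v2"
tag = measure(good)                    # provisioned at signing time
print(secure_boot(good, tag))          # True: untampered image boots
print(secure_boot(b"tampered", tag))   # False: modified image refused
```

A real RoT does this in silicon with asymmetric signatures and anti-rollback counters, which is precisely the logic OpenTitan wants open to inspection.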

But can you trust the RoT itself? The goal of OpenTitan is to provide an open-source design for RoT silicon so that it is (as far as possible) open for inspection.

The OpenTitan SoC will use the RISC-V open-source CPU instruction set architecture, and will be managed by lowRISC, a nonprofit in Cambridge, which has “an open-source hardware roadmap in collaboration with Google and other industry partners,” we’re told.

Today’s announcement comes from Google, Western Digital, the ETH Zurich university, chip maker Nuvoton Technology, and friends.

The Apache 2.0-licensed OpenTitan SoC will include the lowRISC Ibex microprocessor design, cryptographic coprocessors, a hardware random-number generator, volatile and non-volatile storage, IO peripherals, and additional defensive mechanisms. It can be used in any kind of device, from servers and smartphones to Internet-of-Things gadgets.

Most of the elements involved in making the OpenTitan system on a chip are open source

The project founder and director is Dominic Rizzo, a Google Cloud engineer. He said OpenTitan has been underway for about two years, and that thanks to the involvement of the aforementioned partners, “almost exactly half of the contributions are coming from outside Google.”

According to Rizzo: “Current silicon roots of trust are highly proprietary wherein they claim security but you have to take that as a leap of faith and you can’t verify it for yourself. OpenTitan is the first open-source silicon root of trust.”

Rizzo said there will also be a certification process for implementers, and integration guidelines for users. A reference implementation will be built by lowRISC.

Who will use OpenTitan? The Titan name comes from the custom silicon Google uses to secure its servers in its data centres, and according to the team, OpenTitan uses “key learnings from designing Google’s Titan chips.”

Asked whether it would shift to OpenTitan for servers or Pixel devices, the web giant told The Register “we don’t have anything to share about future product plans for Google.” Given the Chocolate Factory’s sponsorship of the project, it would be reasonable to speculate along those lines.

One of Google’s goals is to persuade us of the security of its own systems. Western Digital said it “is working with ecosystem partners to optimize the OpenTitan framework to meet the diverse security demands of data-centric storage use cases from the core to the edge.” ®
