September 27, 2013
The days when a Web designer could count on a big 24-inch screen at 72 pixels per inch are long over. So-called Retina displays on laptops and tiny smartphone screens have for years complicated the job of delivering images best suited to different browsers and screen sizes.
But after several years of effort and fine-tuning, browser makers seem to have finally settled on a solution: an addition to Web page markup that goes by the odd name of "srcset".
If the plan works, people should see high-resolution photos on Retina displays without suffering delays from downloading needlessly bulky images on a phone's browser. They'd also be more likely to get the right images in the first place, because Web developers wouldn't have to spend so much time creating variations of Web pages for different devices.
The first concrete step came just last month, when WebKit, the open-source project behind Apple's Safari browser, adopted the srcset attribute. Google's Chrome browser, whose Blink engine is a WebKit offshoot, followed suit by adding support last week. And yesterday, Mozilla's Marcos Caceres said the organization plans to add srcset support, hopefully before the end of the year.
"For now, it seems pretty clear that srcset technology will happen," said Guy Podjarny, a Web performance researcher for Akamai who attended a September 10 meeting of the Responsive Images Community Group (RICG), which is unofficially managing the problem.
Srcset is vying with another approach: the picture element. Where srcset modifies the img element, an existing part of the HTML standard that governs how images are shown on Web pages, the picture element would have introduced an entirely new item to the Hypertext Markup Language.
Web developer and community group chair Mat Marquis endorsed the picture element last year, but "it was practically sneered at" by browser makers, including the Chrome team, Podjarny said.
Ilya Grigorik, a Chrome developer at Google, believes "srcset isn't perfect, and it may not address every single wishlist item from the
RICG group, but it's good enough, at least for now."
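To make the rivalry concrete, here is a sketch of the two syntaxes; the filenames and breakpoints are illustrative, not taken from any real site.

```html
<!-- srcset: a new attribute on the existing img element. The browser
     picks photo@2x.jpg on a 2x (Retina-class) display. -->
<img src="photo.jpg" srcset="photo.jpg 1x, photo@2x.jpg 2x" alt="A photo">

<!-- picture: the rival proposal, an entirely new element that wraps
     per-condition source elements. -->
<picture>
  <source media="(min-width: 800px)" srcset="photo-large.jpg">
  <source media="(min-width: 400px)" srcset="photo-medium.jpg">
  <img src="photo-small.jpg" alt="A photo">
</picture>
```

The first form leaves existing markup almost untouched, which helps explain why browser makers found it easier to swallow.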
Because Opera shares the same code base as Chrome, the only major browser maker not on board is Microsoft, which in recent years has
made major headway catching up to other browsers' support for newer Web standards. We contacted Microsoft for comment and will update this
post with its response.
Israel Hilerio, the principal program manager for Internet Explorer, showed "a strong and obvious interest" in the issue, according
to an account by Web developer Flo Preynat, but apparently didn't commit to any changes.
Grigorik said he is optimistic that Microsoft will come in on the side of srcset. In any event, the company told meeting members
that it's evaluating the issue now.
"Microsoft doesn't have any official plans, but indicated that they're currently planning their roadmap, and hence their presence
at the meetup to figure out where everyone is headed," Grigorik said in his own account.
Srcset support likely will start with one facet of the attribute: a feature that tackles device pixel ratios (DPR), helping software cope with the difference between physical pixels on a device and the grid of virtual pixels used for layout purposes. In the old days, physical pixels were the only sort, but that's no longer true in the "HiDPI" era, where, for example, Apple's Retina displays use four physical pixels for each virtual one.
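The pixel arithmetic behind that four-to-one figure can be sketched in a few lines of JavaScript; the function name here is my own, not part of any standard.

```javascript
// How many physical pixels a box of CSS (virtual) pixels covers at a
// given device pixel ratio. On a DPR-2 "Retina" display, each CSS
// pixel maps to a 2x2 block of physical pixels, four per virtual one.
function physicalPixels(cssWidth, cssHeight, dpr) {
  return (cssWidth * dpr) * (cssHeight * dpr);
}

// A 100x100 CSS-pixel image covers 10,000 physical pixels on a DPR-1
// screen, but 40,000 on a DPR-2 screen. That gap is why a server
// should send a larger image file to HiDPI devices.
console.log(physicalPixels(100, 100, 1)); // 10000
console.log(physicalPixels(100, 100, 2)); // 40000
```

In a browser, the ratio itself is exposed as `window.devicePixelRatio`, which is what DPR-switching in srcset keys off.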
Grigorik believes srcset will spread from there to address more challenges of optimizing imagery on the Web. "At this point, DPR-switching
is the non-controversial subset of the larger 'responsive images' discussion and srcset has the critical momentum to address it," Grigorik said.
"I would go as far as to say that all the vendors are aligned, and it's just a matter of writing the necessary code," he added.
In other technology news
When two opposing sides can't even agree on what the word "tracking" means, it isn't surprising that no progress has been made toward launching a single browser button that prevents advertisers in the U.S. from tracking your online behavior and the sites you visit. Almost two years after the White House, digital advertisers, browser makers and privacy advocates agreed in principle to create a "Do Not Track" mechanism for Web browsing, the various parties still haven't settled on a basic framework for the software and what it's supposed to do for the average consumer.
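The mechanism itself is simple: a browser with the preference enabled sends a "DNT" HTTP request header with the value "1". What remains unsettled is what sites must do when they see it. A minimal sketch of a server honoring the header, with a function name of my own choosing, might look like this:

```javascript
// Decide whether a request may be tracked, based on the Do Not Track
// request header. `headers` is a plain object of lower-cased request
// header names, as Node.js's http module provides them.
function shouldTrack(headers) {
  // "DNT: 1" means the user has opted out of tracking.
  return headers["dnt"] !== "1";
}

console.log(shouldTrack({ dnt: "1" })); // false: user opted out
console.log(shouldTrack({}));           // true: no preference sent
```

The hard part, as the rest of this story shows, is not parsing the header but agreeing on what "tracking" covers once it arrives.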
In an effort to salvage the Do Not Track initiative, several leaders of the working group spearheading the agreement made a concerted push yesterday.
The World Wide Web Consortium (W3C), which is moderating this increasingly divided and seemingly endless debate, said that it has
chosen a basic foundation upon which all other discussions about the Do Not Track project will be based.
It isn't the first time the W3C has faced a brick wall, but this time some of its leaders intend to make sure the proposal goes through.
All the W3C needs to do now is hammer out no fewer than twenty-five major disagreements about that foundation's base text, including what kinds of data can be collected, how advertisers can identify users, how users' information is stored, and the very definition of tracking itself.
Choosing a framework might technically constitute a step forward for Do Not Track, but the reality is that the besieged tool is
no closer to seeing the light of day.
"And the parties are now further apart on the negotiations than they ever have been in the past," said Jonathan Mayer, a Stanford privacy
researcher and Do Not Track technology developer who is very involved in the negotiations.
The groundwork that the W3C chose was based on a preliminary draft that the organization's Tracking Protection Working Group cooked
up last month.
Both privacy advocates and digital advertisers raised several objections to the draft, but it was considered the working agreement until
the Digital Advertising Alliance (DAA), an industry trade group, proposed sweeping changes to the draft late in June.
In its ruling yesterday, the W3C said the advertisers' proposal muddied already murky waters about what tracking would entail. Under the DAA proposal, advertisers would still be able to profile users and target advertisements to them, even if those users had turned on the Do Not Track feature in their browsers.
And that fails to meet the widely understood meaning of Do Not Track, the W3C said. "There would be widespread confusion if consumers
select a Do Not Track option, only to have targeting and collection continue unchanged," wrote Matthias Schunter and Peter Swire, co-chairs
of the tracking protection working group.
The DAA released a written statement yesterday saying that its proposal was "true to our 2012 White House agreement, and provides
a real choice and a valid option to consumers."
The W3C said it won't entertain any more discussion over the base text except for "polishing and editing" of the language. Understandably, some
working group participants aren't optimistic about where things are headed.
"I don't see a credible path forward for the group," Mayer said. "The June draft goes too far, according to DAA, but the privacy
advocates say it doesn't do anything we want Do Not Track to do."
The group is supposed to deliver its final proposal at the end of July, but no one realistically expects it to meet that deadline.
More problematic, the sides with the real power here, the DAA and the browser makers, can simply ignore the W3C's standard if they don't agree with it. And two of the largest browser makers, Google and Microsoft, also control two of the largest advertising networks, a conflict of interest that needs to be addressed.
But for now, the Do Not Track project is dead in the water, and some browser makers are shipping their own solutions, for better or worse.
Mozilla, for example, said it will implement a feature in Firefox that blocks all third-party cookies. That would effectively block
most tracking, and could be seen as a step in the right direction.
Still, the debate promises to rage on until the parties involved reach a mutually acceptable proposal. Just don't hold your breath: it could take many more months before a draft resolution is written, reviewed and accepted by the W3C and the White House.
In other technology news
Scientists still aren’t exactly certain just when the deep-space Voyager probe will cross the line into interstellar space,
but new data from the spacecraft makes them believe it could be soon.
Voyager 1 and its twin, Voyager 2, were launched in 1977 to visit the giant gas planets, beaming back dazzling postcards of Jupiter, Saturn and their moons. Voyager 2 then went on to tour Uranus and Neptune. After that planet-hopping, the probes were sent on trajectories toward interstellar space.
Voyager 1, now more than 18 billion kilometres from the sun, has experienced two of the three signs of arriving in interstellar space that scientists hoped to see, according to the latest data published in the journal Science.
The craft has experienced charged particles disappearing as they zoom away along the Sun’s magnetic field, and cosmic rays from
outside zooming in. But it hasn’t yet seen an abrupt change in the direction of the magnetic field, which would prove it’s reached
interstellar space and left the Sun’s field behind.
“This strange, last region before interstellar space is coming into focus, thanks to Voyager 1, humankind's most distant scout,"
said Ed Stone, Voyager project scientist at the California Institute of Technology.
"If you looked at the cosmic ray and energetic particle data in isolation, you might think Voyager had reached interstellar
space, but the team feels that Voyager 1 has not yet gotten there because we are still within the domain of the sun's magnetic
field," Stone added.
Scientists still don't know exactly how far the craft has to travel to reach deep space. It could take days, weeks, months or even a few more years, although researchers do think Voyager is close.
The Sun’s influence, marked by the heliosphere, extends at least 13 billion kilometres beyond all the planets in our solar system.
It is filled with the solar magnetic field and an ionised wind blowing from our star. Beyond it lies the interstellar magnetic field
present in the nearby region of the Milky Way and matter from other stars.
The latest data from Voyager 1's cosmic ray, low-energy charged particle and magnetometer sensors, taken between May and September 2012, with additional information up to April this year, show that the spacecraft is in fact on the magnetic highway.
That region allows charged particles to move into and out of the heliosphere, along a smooth magnetic field line, and it was
here that scientists first detected cosmic rays coming in from dying stars.
"We saw a dramatic and rapid disappearance of the solar-originating particles. They decreased in intensity by more than 1,000 times, as if there was a huge vacuum pump at the entrance ramp onto the magnetic highway," said Stamatios Krimigis, the low-energy charged particle instrument's principal investigator at the Johns Hopkins University Applied Physics Laboratory. "We have never witnessed such a decrease before, except when Voyager 1 exited the giant magnetosphere of Jupiter, some 34 years ago."
Between them, Voyager 1 and Voyager 2 toured Jupiter, Saturn, Uranus and Neptune before heading for the edge of our solar system. It has been a remarkable journey, and both probes continue to relay science data back to Earth from the frontier of interstellar space.
In other technology news
After several years of complaints from domain registrars that the cost of running the Whois service is too high, ICANN is now proposing to scrap the service in its current form.
ICANN also said that most of the data is often inaccurate anyway and suggested that some search fields should be restricted to
“authenticated requestors” only.
ICANN even wrote a paper about Whois that seems to suggest that one of the things that's broken in the system is the proliferation
of new gTLDs, which is making the “Internet ecosystem a lot more complex”.
The report notes that “historically, most of these responsibilities were transferred to the registrars, whose primary goal was to
provide working domain names to paying customers.”
Whois searches also don't yield any revenue for the registrars, another source of irritation for them.
"The EWG concluded that today's WHOIS model giving every user the same anonymous public access to gTLD registration data should be abandoned. Instead, the EWG recommends a paradigm shift whereby gTLD registration data is collected, validated and disclosed for permissible purposes only, with some data elements being accessible only to authenticated requestors that are then held accountable for appropriate use," the report states.
Its immediate proposal is for an aggregated registration data service (ARDS) under which registries would no longer need to provide
Port 43 access for searches. While they would hold authoritative data for domain ownership, the aggregators would take a non-authoritative
copy of registration data for “authenticated requestors” to search.
Relieving registries of the obligation to respond to Whois searches would reduce the bandwidth and computing requirements of
the registries, but it would also mean that the ARDS would have to be funded somehow.
As the discussion paper notes: “The issue of cost is an important aspect of the RDS,” suggesting that the gTLD directory services
export working group should “explore this issue further, including costs of development and operation and possible ways in which
these expenses might be borne.”
It offers RDS funding "offset by value-added service fees" as a possible means to pay for the new intermediate registry search operators.