On Jan 22, 2008, W3C published an early draft of HTML 5, a major revision of the markup language for the Web.
One notable improvement is the adoption of the widely-used but non-standard <embed> element.
The IPv4 protocol currently used on the Internet is running out of the addresses needed to accommodate the growing number of users online.
The American Registry for Internet Numbers (ARIN), the organization responsible for handing out IP addresses in North America, says that 19 percent of IPv4 addresses are still available, 68 percent have been allocated, and 13 percent are “unavailable,” whatever that means. There are about 4.3 billion IPv4 addresses (2^32); IPv6 offers 2^128 addresses, roughly 3.4 × 10^38.
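The arithmetic is easy to check; a quick sketch in Python:

```python
# Compare the sizes of the IPv4 and IPv6 address spaces.
ipv4_total = 2 ** 32
ipv6_total = 2 ** 128

print(f"IPv4: {ipv4_total:,}")    # 4,294,967,296 (~4.3 billion)
print(f"IPv6: {ipv6_total:.3e}")  # ~3.403e+38
print(f"IPv6 addresses per IPv4 address: {ipv6_total // ipv4_total:.3e}")
```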
There have been efforts to get more mileage out of IPv4 with tricks like translating between IPv4 and IPv6, or reusing duplicate IPv4 addresses behind a firewall (network address translation, or NAT). This has extended IPv4's lifespan, but it only prolongs the inevitable.
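Those reused internal addresses come from ranges reserved for private networks (RFC 1918), which is why any number of firewalled networks can use the same ones at once. As a quick illustration, Python's standard ipaddress module can tell the two kinds apart (the addresses here are just examples):

```python
import ipaddress

# RFC 1918 private addresses may be duplicated behind any number of
# firewalls; only public addresses must be globally unique.
for addr in ("192.168.1.10", "10.0.0.5", "198.105.232.4"):
    ip = ipaddress.ip_address(addr)
    kind = "private (reusable behind NAT)" if ip.is_private else "public (globally unique)"
    print(f"{addr:>15}  {kind}")
```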
Until now, the biggest obstacle to IPv6 has been that IPv6 address information is not included in most of the root DNS servers that power the Internet. DNS (the Domain Name System) is the Internet service that translates domain names such as www.example.com into the numeric IP (Internet Protocol) addresses, such as 198.105.232.4, that are actually used to connect computers on the Internet.
Starting on February 4th, that barrier will begin to fall as records for IPv6 addresses are added to four of the key root DNS servers. The inclusion of the IPv6 records could make adopting and operating IPv6 a more viable option for network operators.
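Once the root servers carry IPv6 records, an IPv6-capable resolver can hand back AAAA (IPv6) records alongside the familiar A (IPv4) records. Here is a minimal sketch using Python's standard library, with www.example.com standing in as a placeholder hostname:

```python
import socket

# Ask the resolver for every address family the name resolves to;
# AF_INET6 entries come from AAAA records, AF_INET from A records.
for family, _, _, _, sockaddr in socket.getaddrinfo(
        "www.example.com", 80, socket.AF_UNSPEC, socket.SOCK_STREAM):
    label = "IPv6" if family == socket.AF_INET6 else "IPv4"
    print(f"{label}: {sockaddr[0]}")
```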
Chris Beard of Mozilla Labs announced a new project for “deeper integration of the browser with online services,” outlining several goals for the effort.
This is an exciting and very necessary development for Mozilla. As personal data storage is moved from the desktop to the Net, client-side encryption is essential for privacy and security. It is inevitable that the companies offering web apps will suffer a shakeout and some will fold. And security breaches are a fact of online life.
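To make the client-side encryption point concrete, here is a minimal sketch of encrypting data locally before it ever leaves the machine. It assumes the third-party cryptography package and a key derived from a user passphrase; the function name and details are illustrative, not anything Mozilla has announced:

```python
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def encrypt_for_sync(passphrase: bytes, plaintext: bytes) -> tuple[bytes, bytes]:
    """Derive a key from the user's passphrase and encrypt locally.

    Only the salt and ciphertext would be uploaded; the server never
    sees the passphrase, the key, or the plaintext.
    """
    salt = os.urandom(16)
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=200_000)
    key = base64.urlsafe_b64encode(kdf.derive(passphrase))
    return salt, Fernet(key).encrypt(plaintext)


salt, ciphertext = encrypt_for_sync(b"correct horse battery staple",
                                    b"bookmarks, history, saved passwords")
print(len(ciphertext), "bytes of ciphertext, decryptable only client-side")
```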
I’m looking forward to integrating this into the ISubuntu project.
Fortune reports on chatbots used in online stores to talk potential customers out of abandoning their virtual shopping carts. “…A startup called UpSellit is … using live chat to act as a sales assistant …. but here’s UpSellit’s twist: That person on the other end of the live chat box isn’t a person at all. You’re chatting with software that’s designed to fool you into thinking it’s a person.” This is clearly another step in blurring the line between the real and the virtual, and one that raises ethical and possibly legal questions. How would knowing that you’re talking to a bot change your attitude or behavior? What if you thought you were talking to a bot but it turned out to be a real human being?
Network engineer Richard Bennett’s new article for The Register, “Dismantling a Religion: The EFF’s Faith-Based Internet,” explores the difference between the way the EFF wants to see the Internet managed and the discussions currently under way in the IETF.
Bottom line: the Internet has never had a user-based fairness system, and it needs one. All networks need one.
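As a toy illustration of what a user-based fairness system could mean, consider round-robin scheduling across per-user queues, where a heavy user cannot starve light ones. This is a sketch of the general idea only, not a proposal from the article or the IETF:

```python
from collections import deque

# One queue of pending packets per user; round-robin service gives
# each user an equal share no matter how much any one user enqueues.
queues = {
    "alice": deque(["a1", "a2", "a3", "a4", "a5"]),  # heavy user
    "bob":   deque(["b1"]),
    "carol": deque(["c1", "c2"]),
}

schedule = []
while any(queues.values()):
    for q in queues.values():
        if q:
            schedule.append(q.popleft())

print(schedule)  # ['a1', 'b1', 'c1', 'a2', 'c2', 'a3', 'a4', 'a5']
```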
The Email Standards Project is a new effort whose mission is to drive the use and support of web standards in email, working with email client developers to ensure that emails render consistently.
Matthew Patterson and Mark Wyner, the leadership team behind this effort, have kicked things off with an acid test and a report on Web standards support in popular email clients.
In a current feature on Unsung Tech Innovators, Computerworld interviews Robert Kahn, who, along with Vint Cerf, came up with the TCP/IP protocols upon which the Internet is based. Kahn notes that the 30th anniversary of their first successful “internetworking” demo has just passed.
Some search engines have added their own commands to the rules governing how search engine bots behave. The Automated Content Access Protocol (ACAP) proposal, unveiled Thursday by a consortium of publishers at the global headquarters of The Associated Press, seeks to have those extra commands, and more, apply across the board.
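The rules being extended live in a site’s robots.txt file, which well-behaved crawlers consult before fetching pages; ACAP layers richer, publisher-defined permissions on top of this simple allow/deny model. A short sketch of the baseline using Python’s standard library (the URL and bot name are placeholders):

```python
from urllib import robotparser

# Fetch and parse a site's robots.txt, then ask whether a given
# crawler may retrieve a particular URL.
rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

print(rp.can_fetch("ExampleBot", "https://www.example.com/news/story.html"))
```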