Wednesday, November 2, 2011

IBM Open-Sources Potential "Internet of Things" Protocol


The openly stated goal from IBM is to produce a completely new World-Wide Web, one made up of the messages that digitally empowered devices would send to one another. It is the same Internet, but not the same Web. This morning in Ludwigsburg, Germany, IBM announced it is joining with Italy-based hardware architecture firm Eurotech in donating a complete draft protocol for asynchronous inter-device communication to the Eclipse Foundation.



It would be the current data explosion, multiplied by itself. A projected 24 billion connected devices by the year 2020, including RFID tags on shipping crates, heart rate monitors, GPS devices, smartphone firmware, automobile maintenance systems, and (yes, not a joke) earrings, may become more socially active than any teenage human being presently alive. Tens of billions of devices, billions of messages per hour.





It is being called the Message Queuing Telemetry Transport (MQTT) protocol, the machine-to-machine counterpart of HTTP. ReadWriteWeb was granted exclusive advance access to the draft proposal, which the Eclipse Foundation officially receives this morning.



"While smart objects and physical world systems are often integrated with Enterprise and Web middleware today, it is often done using proprietary integration models and combinations of a prolific number of custom protocols and industry standards," the final IBM draft proposal reads. "In most established M2M [machine-to-machine] implementations of connected devices, the data producers and data consumers are programmed to interact in strict and well defined ways. For example, in a smart city, sensor-based systems can alert operators of a broken water main and report the extent of flooding in streets and subways. Well-designed open messaging technology would enable solutions well beyond this, allowing public and private transit systems for example, to monitor these critical alerts, adjusting their routes and even notifying commuters and customers of alternative routes, transportation, lodging and meals. Social networks could subscribe, allowing residents and commuters to interact, adapt and even provide feedback and status to the city."



[Chart: MQTT proposal]



Public transit systems could enable the streets themselves to publish their own traffic status. Traffic signals could become intercommunicative, enabling live and automated rerouting of traffic, including signals sent to cars and their drivers. Water, gas, and electric lines could report their own status the same way. And through inter-Web protocols, you could check the status of your local water main on Facebook.



While we're on that subject: Imagine if Facebook could find you. Literally, you would never have to log in. An application is foreseeable whereby messaging devices connect to a token device located on your person. When you move from your laptop PC to your tablet to your refrigerator, your Facebook session could follow you.



Although the following potential application was not directly stated, it was implied: Consider the potential for intercommunicative monetary tokens. What if every fiver or twenty or century note could report its location and status?



Further into IBM's draft proposal, the company's engineers make clear their position on current HTTP-based Web services protocols in what they call an M2M context: they are inadequate, and need adaptation.



"Open source messaging components... will have to work equally well across the constrained networks and embedded platforms that are inherent to physical world of machine-to-machine systems," the latest draft reads. "This will enable a paradigm shift from legacy point-to-point protocols and the limitations of protocols like SOAP or HTTP into more loosely coupled yet determinable models. It will bridge the SOA, REST, Pub/Sub and other middleware architectures already well understood by Web 2.0 and Enterprise IT shops today, with the embedded and wireless device architectures inherent to M2M."



The "pub/sub" model alluded to here is documented in fuller detail at mqtt.org.



Although IBM is touting its continued commitment to open standards, it's also clear that there's payback for IBM in the form of a head start for middleware support. As this page reveals, among the open source MQTT servers available for experimentation today are message queuing telemetry servers based around WebSphere, and messaging brokers based around Lotus Expeditor.



A great many breakthrough proposals for Internet technologies never actually bore fruit. The Web we could be using now is tremendously more capable than the one in front of you. MQTT is far from being confirmed as the Web of future things. But today's formal proposal is one of the critical steps that future visions take toward becoming realities.





Tuesday, November 1, 2011

70 High Quality and Very Detailed Adobe Illustrator Object Tutorials






Adobe Illustrator is the most preferred vector illustration application. When it comes to technical drawing, you can easily see the difference between Adobe Illustrator and other software. Learning it, or becoming an expert, is not as easy as with Photoshop or other applications...


More Illustrator Tutorials











How To: Break the speed of light in your own backyard




Minute Physics serves up another nifty video.



Via Jennifer Ouellette



Video Link






A deeper look into Depth of Field


A guest post by reader Josh Wells



A shallow depth of field is highly sought after due to its ability to separate the subject from its background, and is found in many professional photographs. By now you probably know that larger apertures (f/2.8 and below) correlate with a shallower depth of field, whereas small apertures (f/16 and above) will render almost the entire frame in focus. In this tutorial I will explain additional factors that control depth of field, how much is too much, and why different sensor and film sizes give different depths of field.


F/ Stops


F/ Stops are generally the hardest of the three elements contributing to exposure to grasp, both because of their inverse relationship with brightness and because it's often hard to understand what is actually happening when you change apertures. To understand this fully, it is important to understand what an F/ Stop really means. An F/ Stop, meaning focal stop, expresses the width of the opening that passes light through the lens as a fraction of the lens's focal length, which is why the same F/ Stop admits the same amount of light on any focal length. Still confused? Here are some diagrams of how an aperture of F/2 is measured for a 50mm lens and a 100mm lens.


[Diagram: measuring an f/2 aperture on a 50mm and a 100mm lens]


Whilst scoring no points for artistic merit, I hope these help explain how F/ Stops are measured and why each of these lenses gives the same exposure at the same aperture setting. More advanced lenses, such as those used in cinematography, use T/ Stops, or transmission stops, which factor in the light a lens loses through its elements to give the true amount of light the lens delivers. Since this tutorial is about depth of field, however, we do not need to worry about T/ Stops.
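As a quick sanity check of that measurement (my own arithmetic, not from the original diagrams), dividing the focal length by the F/ number gives the physical width of the opening:

```python
# The f-number is focal length divided by aperture diameter, so the same
# f/2 setting means a physically different opening on different lenses.
def aperture_diameter_mm(focal_length_mm, f_number):
    return focal_length_mm / f_number

print(aperture_diameter_mm(50, 2))   # 25.0mm opening: a 50mm lens at f/2
print(aperture_diameter_mm(100, 2))  # 50.0mm opening: a 100mm lens at f/2
```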


As the aperture gets wider, it effectively baffles the light, until eventually the lens can only bring one small plane to a point of focus, throwing the rest out of focus. As the aperture becomes smaller (for example, a 20mm lens will have only a 10mm opening at f/2, meaning a deeper depth of field), the light is less baffled and the lens has better control focusing it in and around the required focal point.


So how do we get depth of field?


Depth of field is ultimately affected by two things: the aperture of the lens, which we have just covered, and the distance between subject and background. If you have a lens with focus distances marked on it, have a look at it now. You'll notice that the distance the focus ring travels from 7 to 15 feet (or 2 to 5 metres for those of us using the metric system) is about the same as from 2.5 to 3 feet (or 0.8 to 1 metre), and there is hardly any travel from 25 feet to infinity (try 15 metres too). So at f/4 on a 50mm lens, I could have everything from 25 feet to infinity in focus, or little more than the distance between 4.5 and 6 feet.


To utilize this for the shallowest possible depth of field, bring as much separation as possible between the subject in focus and the background; for example, a headshot will render a distant background out of focus at almost any aperture. If the lens you grabbed also has marks for the depth of field at each given aperture, you'll be able to see how this distance affects each F/ Stop.
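If you prefer numbers to lens markings, the behaviour described above falls out of the standard thin-lens depth-of-field approximation. The sketch below uses those textbook formulas (an assumption of mine, not anything from this post) and assumes the common 0.03mm full-frame circle of confusion.

```python
# Textbook thin-lens depth-of-field approximation (an assumption of this
# sketch, not a formula from the original post).
def dof_limits_m(focal_mm, f_number, distance_m, coc_mm=0.03):
    f = focal_mm / 1000.0          # focal length in metres
    c = coc_mm / 1000.0            # circle of confusion in metres
    s = distance_m                 # focus distance in metres
    hyperfocal = f * f / (f_number * c) + f
    near = s * (hyperfocal - f) / (hyperfocal + s - 2 * f)
    far = s * (hyperfocal - f) / (hyperfocal - s) if s < hyperfocal else float("inf")
    return near, far

# A 50mm lens at f/4: the in-focus zone widens dramatically with distance.
print(dof_limits_m(50, 4, 2))   # ~ (1.83m, 2.21m): under half a metre in focus
print(dof_limits_m(50, 4, 8))   # ~ (5.79m, 12.94m): over seven metres in focus
```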


Crop Sensors


This can get a bit confusing when you account for different sensor sizes, such as APS-C as used by many consumer DSLRs, Four Thirds, and medium format as found in some film cameras and some very expensive digital ones. Many people see the correlation between focal length and depth of field and assume that a cropped or Four Thirds sensor equates to a longer focal length and therefore a shallower depth of field. This is not the case: a cropped sensor works more as the name entails, literally shooting an equivalent crop of what would appear on a full-frame sensor. Let's explain this with math.


We’ll use a 50mm f/1.8 lens.


So this lens has a maximum aperture diameter of 50mm / 1.8 ≈ 27.8mm.


Now let's put this on an APS-C cropped sensor with a 1.6x crop factor: 50mm × 1.6 = 80mm equivalent focal length.


However, the maximum aperture diameter is still 27.8mm: 80mm equivalent focal length / 27.8mm aperture ≈ 2.9, the new equivalent f-number.


We can now think of the lens not as an 80mm F/1.8 but, more accurately, as roughly an 80mm F/2.8.


This also works the other way with larger sensors, which have crop factors below one. Medium format, for example, can have a crop factor of around 0.6; consider a 90mm F/2.8 lens:


90mm × 0.6 = 54mm.


Meaning that with a medium format sensor or film, one can have the depth of field of a 90mm F/2.8 lens with the angle of view of nearly a 50mm lens.
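The whole worked example condenses into a small helper; this is just my restatement of the arithmetic above, keyed to the same 50mm f/1.8 and 90mm f/2.8 examples.

```python
# The physical aperture diameter does not change with the sensor, so the
# full-frame-equivalent f-number scales by the same crop factor as the
# focal length does.
def full_frame_equivalent(focal_mm, f_number, crop_factor):
    aperture_mm = focal_mm / f_number      # e.g. 50 / 1.8 = 27.8mm opening
    eq_focal = focal_mm * crop_factor      # e.g. 50 * 1.6 = 80mm
    eq_f_number = eq_focal / aperture_mm   # e.g. 80 / 27.8 = 2.9
    return eq_focal, eq_f_number

print(full_frame_equivalent(50, 1.8, 1.6))  # (80.0, ~2.9): behaves like an 80mm F/2.8
print(full_frame_equivalent(90, 2.8, 0.6))  # (54.0, ~1.7): 90mm F/2.8 depth of field, ~50mm view
```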


Too much Bokeh?!


Whilst in many cases, when we want to throw a background out of focus, we want as shallow a depth of field as we can afford, there are several cases in which one must find ways to deepen the depth of field.


One situation where depth of field can be too shallow is wildlife photography, where focal lengths in excess of 300mm are required in order not to disturb the animals, especially when shooting birds. Avian photography can have so shallow a depth of field that it becomes impossible to focus accurately, or difficult to keep the entire bird in focus. This is avoided by stopping down to smaller apertures, and sometimes a flash extender, such as the Visual Echoes FX3 Better Beamer, or a simpler DIY equivalent, is required to sufficiently light the subject.


Another is macro photography, in which our other variable, distance, is pushed to an extreme. In macro photography it is not uncommon to stop down to f/32 in order to gain the necessary depth of field. The resulting light loss is again offset with flashes built specifically for macro work, often ring flashes.


Post originally from: Digital Photography Tips.



Check out our more Photography Tips at Photography Tips for Beginners, Portrait Photography Tips and Wedding Photography Tips.






Monday, October 31, 2011

Test Page For GDrive Appearing In Google Search Results

[Screenshot: Google search result for 'writely']

In case there was still any doubt about the long-rumored "GDrive's" existence, a page now appearing in Google's search results offers a pretty clear indication that something is going on. On Writely.com, the online word processing service Google acquired in 2006, a test page is now appearing with a title that reads "test page for Platypus (GDrive)."


Well, there you have it.


Currently, the full title of the search result reads “Writely – The Web Word Processor – test page for Platypus (GDrive)” and the URL is www.writely.com/BasePage.aspx. Of course, when you click through, the link takes you to an error page of sorts, with a message reading “Unknown action. Please check the URL and try again.”



It should be noted that the www.writely.com domain itself redirects to Google Docs.


Platypus, for those unaware, was the codename for GDrive used internally at Google until it was killed off in 2008. But recent findings have hinted that Google Drive is making a return. For example, in September, MG reported that code found in Chromium (the open source Web browser which serves as the testing ground for Google Chrome) referenced the non-public URL drive.google.com.


Later that month, a screenshot from a presentation at a Google-sponsored event showed something that looks very much like Google Drive, complete with text that even reads “My Google Drive.”


From MG’s earlier reports, the forthcoming service is essentially a rebranding of Google Docs with an accompanying desktop software component, similar to Dropbox. When exactly Google will finally launch Google Drive, after years of waiting, is still unknown. But at least we know they’re working on it.


Thanks Dan Behun





Life in a Day now available on YouTube

On July 24, 2010, thousands of people around the world recorded videos of their lives to take part in Life in a Day, a cinematic experiment to document a single day on earth. From more than 4,500 hours of footage recorded and uploaded to YouTube, Oscar-winning director Kevin Macdonald and executive producer Ridley Scott created a 90-minute feature film that offers an entertaining, surprising and moving view of life on earth.



After a theatrical release in countries around the world including appearances at the Sundance, Berlin, SXSW and Sydney film festivals, Life in a Day is finally coming home to YouTube—in its entirety, for free.



Starting today you can watch Life in a Day on YouTube, available with subtitles in 25 languages. So if you haven’t seen it yet or want to relive the experience that The Times of London considers “a thrilling piece of cinema” and the Washington Post called “a profound achievement,” now’s your chance.







If you’d like to own Life in a Day, a DVD is also available. You can find more details about this, and the whole project, on the film’s official YouTube channel.



Posted by Tim Partridge, YouTube Marketing Manager




(Cross-posted from the YouTube Blog)


Sunday, October 30, 2011

The Story of Android So Far


The team at xcubelabs have kindly helped us consolidate a very nice pictorial view of the Android story so far. As an update: according to Network World in August 2011, Android's worldwide market share stands at 48%.


Enjoy!


