Friday, August 21, 2009

Canon PowerShot G11


Canon this week announced an addition to their very popular G series line - the Canon PowerShot G11.


The PowerShot G11 features a list of specifications that a mid to higher level photographer looking for a compact camera will want to consider. It has a 10 megapixel high sensitivity CCD sensor, a 28-140mm image stabilized lens, 2.8 inch tilt and swivel LCD screen, Flash Sync of up to 1/2000th of a second, Digic 4 image processing, HDMI connectivity, RAW shooting plus plenty more.

The Canon PowerShot G11 should hit stores in October with a recommended retail price of $499.99. It is already available for pre-order at Amazon at this price.


Canon Powershot G11 News Release

Canon today announces the launch of the feature-packed PowerShot G11, the successor to the multi-award-winning PowerShot G10 - the favourite compact of professional photographers and photo agencies the world over.

Commenting on his use of the predecessor to the PowerShot G11, the PowerShot G10, Gary Knight, acclaimed photojournalist and co-founder of the VII Photo Agency, said: “As a photojournalist who covers warzones, one of the main challenges I face is getting high quality images in hostile environments.

To achieve this I need a camera that offers great image and build quality without the bulk, and the PowerShot G series is perfect for this purpose. When shooting in areas of conflict, it’s important to have a compact camera in my bag that allows me to work discreetly but also provides the level of quality required to get the photos I need. The G Series excels in this respect, delivering great quality images from a compact body that is less intimidating than that of an SLR.”

Professional photographers will benefit from the G11’s greatly expanded dynamic range. Canon’s new Dual Anti-Noise System combines a high sensitivity 10.0 Megapixel image sensor with Canon’s enhanced DIGIC 4 image processing technology to increase image quality and improve noise performance by up to 2 stops (compared to the PowerShot G10). The PowerShot G11 also includes i-Contrast technology, which prevents highlight blowout whilst retaining low-light detail – ideal for difficult lighting situations.

The premium quality Canon lens delivers picture-perfect performance, offering a 5x wide-angle (28mm) zoom with optical Image Stabilizer (IS). This allows handheld shots to be taken at shutter speeds up to 4 stops slower than with conventional non-IS models, enabling shooting in darker conditions or at a lower ISO. By greatly reducing blur caused by camera shake, photographers can shoot at longer focal lengths and in lower light without the need for a tripod.
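As a back-of-the-envelope check (not from the press release), each stop of stabilization doubles the usable exposure time, so a 4-stop IS gains a factor of 2^4 = 16 over the classic 1/focal-length handholding rule. A small sketch, with the function name and the rule itself being our own illustration:

```python
def slowest_handheld_shutter(focal_length_mm, is_stops=0):
    """Rule-of-thumb slowest handheld shutter speed, in seconds.

    Baseline: the 1/focal-length rule; each stop of stabilization
    doubles the usable exposure time.
    """
    return (1.0 / focal_length_mm) * (2 ** is_stops)

# At the 140mm long end, 4-stop IS stretches ~1/140s to roughly 1/9s.
base = slowest_handheld_shutter(140)
with_is = slowest_handheld_shutter(140, is_stops=4)
print(base, with_is)
```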

The digital compact includes the ability to shoot in RAW format and is compatible with Canon’s Digital Photo Professional (DPP) software, ensuring that photo shoots can be easily integrated into a photographer’s workflow.

Stephen Munday, Senior Operations Director, Editorial, Getty Images, comments: “The quality of images delivered by the PowerShot G series is so good that we use G10 images within our editorial library, and our photographers using the G series models do so because of the image quality and flexibility they provide.”

He continues, “The unpredictable nature of news photojournalism requires our photographers to get shots in all kinds of situations, and the size of the G10 allows them to do that without drawing too much attention to themselves. The low-light capabilities of the new G11 will be of even greater help in that respect, as it will allow our photographers to shoot high quality images without flash in even more situations.”

Alongside superb image quality, the compact size of the PowerShot G11 and a 2.8-inch vari-angle PureColor II VA LCD (461k dots) make it the ideal choice for professional photographers to use where an SLR is impractical or obtrusive. The G11 is ergonomically designed for faster, more accurate menu scrolling. Analogue-style dials for ISO and exposure compensation give photographers instant, familiar access to common settings and features.

“I’ve been a user of the G series since the G5” says Edmond Terakopian, photojournalist and winner of the British Press Awards Photographer of the Year and a World Press Photo award for Spot News. “Recently I’ve been using the G10 which is absolutely spot on; image quality, usability, reliability and build quality are all excellent. It’s the one camera that’s always with me, no matter where I am; whether on or off duty. I know I can rely on it to help me get the picture, no matter what.

Over the years I’ve used my G Series cameras on the occasions when assignments have needed discretion. It’s a satisfying feeling telling a picture editor that the photograph he’s just complimented was from a compact camera.”

The G11 is equipped to deal with any lighting condition. Low Light mode enables photographers to shoot at up to ISO 12,800 at a reduced 2.5MP resolution and 2.4fps, capturing brilliant shots indoors without the need for flash, whilst a built-in Neutral Density (ND) filter decreases light levels by 3 stops, allowing creative control in bright conditions.

The camera’s highest flash sync speed has been increased to 1/2000th of a second, reducing the possibility of overexposed bright scenes. A real-time histogram displays brightness levels on the PureColor II VA LCD screen, so photographers can easily assess conditions and change settings while shooting.

The PowerShot G11 gives photographers the freedom to capture fast-paced action perfectly. Quick Shot mode takes images almost instantly after the shutter is pressed, so fast-moving objects are always captured, whilst Servo AF/AE continuously adjusts focus and exposure to optimise settings when photographing moving subjects.

The PowerShot G11 can be used with a wide range of Canon accessories, including the Speedlite 270EX, 430EX II and 580EX II; Macro Twin Lite MT-24EX and Macro Ring Lite MR-14EX flashes for enhanced shooting options; the Speedlite Transmitter ST-E2, Speedlite Bracket BKT-DC1 and Remote Switch RS-60E3. Underwater photographers can even team the PowerShot G11 with the specially designed Waterproof Case WP-DC34 - an underwater housing allowing full control of the camera at depths down to 40m. The PowerShot G11 includes an HDMI port so users can review images on a full HD screen via an optional HDMI cable.

Post from: Digital Photography School - Photography Tips.



Usain Bolt - 19.19 NEW WORLD RECORD - 200 meters

Wednesday, August 19, 2009

Systemic Risk: Is it Black Swans or Market Innovations?


Below is the latest comment from the Institutional Risk Analyst. My former Fed colleague Dick Alford came up with the idea, then Dennis and I revised and extended. Enjoy and have a great August. — Chris

Systemic Risk: Is it Black Swans or Market Innovations?

The Institutional Risk Analyst

August 18, 2009

“Whatever you think you know about the distribution changes the distribution.”

Alex Pollock

American Enterprise Institute

In this week’s issue of The IRA, our friend and colleague Richard Alford, a former New York Fed economist, and IRA founders Dennis Santiago and Chris Whalen ask whether we really see Black Swans in market crises or merely our own expectations. Of note, we will release our preliminary Q2 Banking Stress Index ratings on Monday, August 24, 2009. As with Q1, these figures represent about 90% of all FDIC-insured depositories, but exclude the largest money center banks (aka the “Stress Test Nineteen”), thus providing a look at the state of the regional and community banks as of the quarter ended June 30, 2009. Click here to register for The Institutional Risk Analyst or request a trial of our products.

Many popular explanations of recent financial crises cite “Black Swan” events - extreme, unexpected, “surprise” price movements - as the causes of the calamity. However, in looking at our crisis-wracked markets, we might consider that the Black Swan hypothesis doesn’t fit the facts as well as an alternative explanation: namely, that the speculative outburst of financial innovation and the artificially low short-run interest rates pursued by the Federal Open Market Committee combined to change the underlying distribution of potential price changes. This shift in the distribution made likely outcomes that previously seemed impossible or remote. The shift in possible outcomes, in turn, generated surprise in the markets and arguably led to the emergence of “systemic risk” as a metaphor to explain these apparent “anomalies.”

But were the failures of Bear Stearns, Lehman Brothers, Washington Mutual or the other “rare” events really anomalous? Or are we just making excuses for our collective failure to identify and manage risk?

The choice of which hypothesis to ultimately accept in developing the narrative of the causation of the financial crisis has strategic implications for understanding, as well as reducing the likelihood of, future crises, including the effect on the safety and soundness of financial institutions. To us, the hard work is not trying to artificially limit the range of possibilities with convenient assumptions, but to model risk when you must assume, as a hard rule like those that govern the physical sciences, that the event distribution is in constant flux.

If we as financial and risk professionals are serious in our claims to model risk proactively, then change, not static assumptions, must be the rule in terms of the possible outcomes - “paranoid and nimble,” in practical terms. After all, these modeling exercises ultimately inform and support risk assumptions for decisions that are used in value-at-risk (VaR) assessments for investors and for capital adequacy benchmarking for financial institutions.

Even before the arrival of Benoit Mandelbrot in the 1960s, researchers had observed that distributions of price changes in various markets were not normally distributed. The observed distributions of price changes had fatter tails than the normal distribution. Nassim Nicholas Taleb, author of The Black Swan and Fooled by Randomness, and others have dubbed extreme price moves significantly larger than those predicted by a normal distribution “Black Swans.” Indeed, Taleb and others have linked Black Swan price change events to the recent financial crisis, suggesting in effect that we all collectively misunderstood on which side of the distribution of possible risk outcomes we stood.

The argument is as follows: current risk management and derivative pricing regimes are based upon normal distributions. Price movements in the recent financial crises were unpredictable, low-probability events that were also greater than normal distribution models predicted. Hence our collective failure to anticipate Black Swan events is “responsible” for the recent crises, as mis-specified risk management models failed due to fatter-than-normal tails.
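The gap between a normal-tail model and observed extremes is easy to quantify. A minimal sketch, with entirely illustrative numbers (the 2% daily volatility and 99% confidence level are our assumptions, not the article's):

```python
from math import erfc, sqrt
from statistics import NormalDist

# 99% one-day value-at-risk under a normal assumption.
daily_vol = 0.02                  # assumed 2% daily return volatility
z = NormalDist().inv_cdf(0.01)    # ≈ -2.33 standard deviations
var_99 = -z * daily_vol           # ≈ 4.7% one-day loss threshold

# Tail probability of a 10-sigma down move under the same model,
# computed with erfc to avoid floating-point underflow. The normal
# model calls such a move essentially impossible (~1e-23), yet moves
# of that order were observed during the crisis.
p_ten_sigma = 0.5 * erfc(10 / sqrt(2))
print(f"99% VaR: {var_99:.3%}  P(10-sigma): {p_ten_sigma:.1e}")
```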

The alternative explanation, however, links the extreme price movements not to aberrations with respect to a stable, observable mean, but instead to the activation of alternate stable means as a result of jumping discontinuously through tipping points — much in the same way particles jump quantum levels in energy states when subjected to the cumulative effects of energy being added to or removed from their environments. These tipping points are as predictable as the annual migrations of ducks. Swans, alas, rarely migrate, preferring to stay in their summer feeding grounds until the water freezes, then move only far enough to find open water. Sound familiar?

Force feed a system with enough creative energy via permissive public policies and the resulting herd behaviors, and the system will change to align around these new norms, thereby erasing the advantages of the innovators and creating unforeseen hazards. “Advances” such as OTC derivatives and complex structured assets, and very accommodating Fed interest rate policy, resulted in unprecedented leverage and maturity mismatches by institutions and in markets that are the perfect quantum fuel to brew such change.

While the exact timing of each tipping point and the magnitude of the crises remain somewhat inexact, the waves of change and the ultimate crisis-borne shift are broadly predictable. The probabilities attached to extreme price moves are calculable as the cost of deleveraging an accumulation of innovation risk that must be shed as the system realigns. The “Black Swan” approach assumes a stable distribution of price changes with fatter-than-“normal” tails. The alternative posits that the distribution of possible price changes was altered by innovation and the low cost of leverage. It also posits that the new distributions allowed, indeed required, more extreme price movements. Two examples illustrate the alternative hypothesis.

Once upon a time, the convertible bond market was relatively quiet. The buy side was dominated by real money (unleveraged) players who sought the safety of bonds, but were willing to give up some return for some upside risk (the embedded equity call option).

More recently the market has been dominated by leveraged hedge funds doing convertible bond arbitrage. They bought the bonds, hedging away the various risks. In response to the advent of the arbitrageurs, the spread between otherwise similar conventional and convertible bonds moved to more accurately reflect the value of the embedded option and became less volatile.

When the financial crises hit, however, arbitrageurs were forced to liquidate their positions as losses mounted and it became difficult to fund the leveraged positions. Prices for convertible bonds declined and for a period were below prices for similar conventional bonds — something that had been both unheard of and considered impossible as the value of an option cannot be negative.

Was this a Black Swan type event, or had the market for convertible bonds and the underlying distribution of price changes, been altered? The mean spread between otherwise similar conventional and convertible bonds had changed. The volatility of the spread had changed. Forced sales and the public perception of possible future forced sales generated unprecedented behavior of the heretofore stable spread. The emergence and then dominance of leveraged arbitrage positions altered the market in fundamental ways. What had not been possible had become possible.

Now consider bank exposures to commercial real estate. Numerous financial institutions, hedge funds (e.g. at Bear Stearns), sellers of CDS protection (e.g. AIG) and banks (many of them foreign, as reflected in the Fed swap lines with foreign central banks) suffered grievous losses when the real estate bubble popped. Many of these losses remain as yet unrealized.

As investors and regulators demanded asset write-downs and loss realization, many of these institutions expressed dismay. They had stress-tested their portfolios, the large banks complained, often with the support of regulators. The large banks thought their geographically diversified portfolios of MBSs would immunize them from falls in real estate prices, as the US had experienced regional, but never (except for the 1930s) nationwide, declines in housing prices. These sophisticated banks incorporated that assumption into their stress tests even as they and the securitization process were nationalizing - that is, changing - the previously regional and local mortgage markets.

Was the nationwide decline in housing prices an unpredictable Black Swan event, or the foreseeable result of lower lending standards, a supportive interest rate environment, and financial innovation that led to the temporary nationalization of the mortgage market? Risk management regimes failed, and banks have been left with unrealized losses that still threaten the solvency of the entire system in Q3 2009.

However useful or necessary “normal” statistical measures such as VaR might be, they will not be sufficient to insulate institutions or the system from risk arising from rapidly evolving market structures and practices. Furthermore, insofar as models such as VaR, which are now enshrined in the bank regulatory matrix via Basel II, were the binding constraint on risk taking, they acted perversely, allowing ever greater leverage as leveraged trading acted to reduce measured volatility! Remember, the convertible bond market at first looked placid as a lake as leverage grew, but then imploded in a way few thought possible. Is this a Black Swan event or a failure of the stated objectives of risk management and prudential oversight?

We all know that risk management systems based solely on analysis of past price moves will at some point fail if financial markets continue to change. The problem with current risk management systems cannot be fixed by fiddling with VaR or other statistical models. Risk management regimes must incorporate judgments about the evolution of the underlying markets, the distribution of possible price changes and other dynamic sources of risk.

Indeed, as we discussed last week (“Are You Ready for the Next Bank Stress Tests”), this is precisely why IRA employs quarterly surveys of bank stress tests to benchmark the US banking industry. Think of the banking industry as a school of fish, moving in generally the same direction, but not uniformly or even consistently. There is enormous variation in the path of each member of the school, even though from a distance the group seems to move in unison.

Stepping back from the narrow confines of finance for a moment, consider that the most dramatic changes in the world are arguably attributable to asymmetric confluences of energy changing the direction of human history. It has happened over and over again. The danger has been and always will be the immutable law of unintended consequences, which always comes back to bite the arrogant few who believe they can control the future outcome. And it is always the many of us who pay the price for these reckless leaps of faith.

If the recent financial crises were truly highly infrequent random events, then any set of policies that could continuously prevent their recurrence will seemingly be very expensive in terms of idle capital and the presumably less efficient markets required to avoid them. If, on the other hand, the crisis was the result of financial innovation and the ability to obtain leverage cheaply, then society need not continuously bear all the costs associated with preventing market events like the bursting of asset bubbles.

Policymakers would like everyone to believe that the recent crises were random, unpredictable Black Swan events. How can they be blamed for failing to anticipate a low-probability, random, and unpredictable event? If, on the other hand, the crises had observable antecedents, e.g. increased use of leverage, maturity mismatches, near-zero default rates, and spikes in housing price-to-rental and housing price-to-income ratios, then one must ask why policymakers did not connect the dots, attach significantly higher-than-normal probabilities to the occurrence of severe financial disturbances, and fashion policies accordingly. Ultimately, that is a question that Ben Bernanke and the rest of the federal financial regulatory community have yet to answer.

Questions? Comments?

About IRA Products and Services

IRA offers advanced analytics for risk surveillance and investment research via subscription products such as the IRA Bank Monitor for Professionals covering the US banking industry and the IRA Corporate Monitor covering public companies. For a trial subscription or an on-line demonstration, please register here.

IRA Advisory Services including our channel research and diligence support services are available to qualified clients. For more information, please contact our offices.

IRA for Consumers

IRA provides consumers easy to buy online reports to independently check on their banks via our How’s My Bank? system.

IRA on Web 2.0

For updates during the week please follow IRA


Tuesday, August 18, 2009

Sony unveils slimmer PS3: $300, lands in September (updated!)


Hardly a surprise, but Sony got on stage today at GamesCom and confirmed what we've all known deep down in our hearts: the new, slimmer PS3 is really real. It'll be out in the first week of September (September 1 in North America and Europe, September 3 for Japan), and will retail for $300 (or 300 Euro, or 29,980 Yen). It's smaller and lighter, has a 120GB HDD, and packs 'all the same features' of the regular PS3 while consuming 34 percent less power and taking up 32 percent less space. Existing PS3 SKUs have their prices dropped a hundie apiece tomorrow in anticipation, so be sure to grab a space heater while you've still got a shot -- though we're not sure why you'd pay $300 for an 80GB PS3 when you can wait a couple weeks and get 120GB in a cuter package. A couple pics of the unveil are after the break.

The new 3.0 firmware will be released concurrently with the PS3 slim, which should provide a breath of fresh air for existing machines. Other new features of the PS3 slim include BRAVIA Sync, which allows you to control the PS3 XMB over HDMI through your BRAVIA TV remote, and System Standby to shut off the PS3 when the BRAVIA TV is off. Sony also claims this new machine will run more quietly than existing PS3 systems, which is good news for people who like to watch movies or have conversations in the general vicinity of their game console. There's also a Vertical Stand, which will retail for $24. Not so awesome is Sony's removal of the Install Other OS feature... farewell, Linux. We hardly knew thee.

Update: We've got press shots! Check 'em all in the gallery below, and be sure to pore over that to-scale comparo pic up top. There's one from above as well, which reveals that the new model is actually 'deeper' than the PS3 fat.

Update 2: Video! Our main man Jack Tretton talks up and shows off his spanking new slim PS3 after the break. We've also got a full rundown of the specs for your perusal, and added some new information above.



Sony unveils slimmer PS3: $300, lands in September (updated!) originally appeared on Engadget on Tue, 18 Aug 2009 15:22:00 EST. Please see our terms for use of feeds.


Portfolio Viewer Tracks Your Stocks in an Attractive Interface [Downloads]


Windows/Mac/Linux: Free Adobe AIR application Portfolio Viewer tracks the performance of your investments from your desktop with a handful of helpful charts and graphs.


Just launch the application, create a new portfolio, and start adding the stocks you're invested in along with any transactions. Portfolio Viewer autocompletes your stock name and ticker symbol when you're adding an investment, then automatically imports stock values over time, calculates your annual rate of return from your portfolio, determines your investment allocation, and highlights the diversity of your investments in the allocation grid.

I wouldn't count on Portfolio Viewer to help determine all my investment moves, but it's a great little tool to keep an eye on the performance of your portfolio.

Portfolio Viewer is freeware, requires Adobe AIR, and works on Windows, Mac OS X, and Linux.


Monday, August 17, 2009

August 17, 2009 Stock Market Recap


The bears must really be hating this market. Right after all the news about how much short interest has dropped, indicating that they were being forced out of their positions, we get a day which would have been a nice payoff. In a flash the indices are at new August lows. There was no serious volume to accompany today's drop and the indices are still above their March trendlines. That makes me think that this is nothing but a normal pullback in a larger uptrend. The bears will need to break some of the support levels in the charts below, especially the June highs, in order to get the upper hand.

Trend Table

All the short-term trends switched to down today.

Trend | Nasdaq | S&P 500 | Russell 2000

(+) Indicates an upward reclassification today

(-) Indicates a downward reclassification today

(Lat) Indicates a lateral trend

*** I'm simply using the indices' relations to their 200, 50 and 10-day moving averages to tell me the long, intermediate and short-term trends, respectively.
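The rule described above is mechanical enough to sketch in a few lines of Python. The moving-average windows are the ones the post states; the price series and function names are made up for illustration:

```python
def sma(prices, window):
    """Simple moving average of the last `window` closes."""
    return sum(prices[-window:]) / window

def classify_trends(prices):
    """Long-, intermediate- and short-term trend from the 200-, 50-
    and 10-day moving averages, as described above: a close above
    the average is an uptrend, below it a downtrend."""
    close = prices[-1]
    trends = {}
    for name, window in (("long", 200), ("intermediate", 50), ("short", 10)):
        trends[name] = "up" if close > sma(prices, window) else "down"
    return trends

# Toy series: a steady 200-day rally whose last close drops sharply,
# flipping only the short-term trend - the situation in the recap.
series = list(range(1, 201))
series[-1] = 185
print(classify_trends(series))
```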


Growth in Potential GDP

Growth in Potential GDP

Daily Observations from GaveKal

We are hearing concerns, from some clients and friends, that the brutal corporate cost-cutting seen in the wake of the subprime crisis will delay the recovery, because this trend is killing the US consumer. In other words, how can one spend if he has lost his job or fears as much, or has seen his work hours drastically reduced, taken a pay cut, or expects his company pension system is about to implode? For us, this all boils down to a crucial question: do we need consumption to pick up in order to achieve a rebound in growth? The answer to this question very much depends on whether one accepts a Keynesian view of the economic process, or a Schumpeterian (or classical) view. We hope our readers forgive us, but we are now going to have to get a tad theoretical....
* In a Keynesian view, consumption is the motor of growth. If companies slash their payrolls, consumption contracts and we enter into a vicious cycle in which the subsequent decline in demand leads to a second wave of cuts, which then leads to a further decline in consumption, and so on and so forth. The Keynesian cycle may have been useful from 1945 to 1990, but in the past 20 years, globalization and just-in-time technologies have changed the nature of corporate management, which is why we believe a classical, capital-spending led view of the economic cycle will reassert itself.
* In a classical view, as exemplified by "Say's law" and reinforced by Schumpeter, corporate profitability is the cause, not the consequence, of economic growth. Thus, Schumpeter would see the current cycle as the destruction phase in the creative-destruction process that propels the economic cycle. Capital and labor are currently moving from the sectors in decline (e.g., McMansions) to the sectors in expansion (e.g., tech, alternative-energy infrastructure, etc.). Once momentum in the growth sectors overwhelms the decaying ones, macro growth resumes. Under this framework, consumption kicks in at the end of the cycle (for more on this, see the very first paper published by GaveKal, Theoretical Framework for the Analysis of a Deflationary Boom).
Within our firm, Charles is the major proponent of the Schumpeterian view, and this thinking was apparent in his and Steve's recent ad hoc, A V-Shaped Recovery in Profits. Due to the quick reflexes that new technologies allow, corporates are managing their cash flow better than ever. Rarely ever, for instance, have companies (ex-financials) remained in such strong positions during a recession, which is yet another reason why we believe that capital spending, rather than consumption, will spark the recovery.
Indeed, the scale at which corporates have been able to cut costs and return to profitability has laid the groundwork for a deflationary boom of epic proportions (which would be a major surprise for those who fear an easy-money inflationary nightmare). Of course, there is a major threat to this deflationary-boom scenario, and that is the increased government intervention we are seeing in most corners of the world. If government intervention manages to kill off the return on invested capital, as it did in the 1930s, then the current opportunity will go up in smoke. Regular readers know we tend to err on the side of optimism; at this point we still hold out hope that a major lurch to a big-government era can be resisted, as exemplified, for example, by the unexpectedly strong fight we are seeing against the health-care bill, or the ability of so many US financials to pay back their debt to the US Treasury, thus lowering the extent of government influence on their business decisions. Thus, in our view, a period of deflationary boom is the likeliest scenario, and investors should focus on sectors and countries that will see the largest resurgence in capital spending.

Growth in "Potential GDP" Shows Limited Potential

By John P. Hussman, Ph.D.
Historically, two factors have made important contributions to stock market returns in the years following U.S. recessions. One of these that we review frequently is valuation. Very simply, depressed valuations have historically been predictably followed by above-average total returns over the following 7-10 year period (though not necessarily over very short periods of time), while elevated valuations have been predictably followed by below-average total returns.
Thus, when we look at the dividend yield of the S&P 500 at the end of U.S. recessions since 1940, we find that the average yield has been about 4.25% (the yield at the market's low was invariably even higher). Presently, the dividend yield on the S&P 500 is about half that, at 2.14%, placing the S&P 500 price/dividend ratio at about double the level that is normally seen at the end of U.S. recessions (even presuming the recession is in fact ending, of which I remain doubtful). At the March low, the yield on the S&P 500 didn't even crack 3.65%. Similarly, the price-to-revenue ratio on the S&P 500 at the end of recessions has been about 40% lower than it is today, and has been lower still at the actual bear market trough. The same is true of valuations in relation to normalized earnings, even though the market looked reasonably cheap in March based on the ratio of the S&P 500 to 2007 peak earnings (which were driven by profit margins about 50% above the historical norm).
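Since the price/dividend ratio is just the reciprocal of the dividend yield, the "about double" claim in the paragraph above can be cross-checked with simple arithmetic:

```python
# Dividend yield and price/dividend ratio are reciprocals, so the
# figures quoted in the text can be verified directly.
avg_yield_at_recession_end = 0.0425   # ~4.25% average since 1940
current_yield = 0.0214                # S&P 500 yield cited above

pd_normal = 1 / avg_yield_at_recession_end    # ≈ 23.5x dividends
pd_current = 1 / current_yield                # ≈ 46.7x dividends
print(f"{pd_current / pd_normal:.2f}x the typical ratio")  # ≈ 1.99x
```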
Stocks are currently overvalued, which – if the recession is indeed over – makes the present situation an outlier. Unfortunately, since valuations and subsequent returns go hand in hand, the likelihood is that the probable returns over the coming years will also be a disappointingly low outlier. In short, we should not assume, even if the recession is ending, that above average multi-year returns will follow.
That conclusion is also supported by another driver of market returns in the years following U.S. recessions: prospective GDP growth. Every quarter, the U.S. Department of Commerce releases an estimate of what is known as "potential GDP," as well as estimates of future potential GDP for the decade ahead. These estimates are based on the U.S. capital stock, projected labor force growth, population trends, productivity, and other variables. As the Commerce Department notes, potential GDP isn't a ceiling on output, but is instead a measure of maximum sustainable output.
The comparison between actual and potential GDP is frequently referred to as the "output gap." Generally, U.S. recessions have created a significant output gap, as the recent one has done. Combined with demographic factors like strong expected labor force growth, this output gap has resulted in above-average real GDP growth in the years following the recession.
The chart below shows the 10-year growth rates in actual and potential GDP since 1949 (the first year that data are available).
The blue line presents actual growth in real U.S. GDP in the decade following each point in time. This line ends a decade ago for obvious reasons. The red line presents the 10-year projected growth of "potential" real GDP. This line is much smoother, because the measure of potential GDP is not concerned with fluctuations in economic growth, only the amount of output that the economy is capable of producing at relatively full utilization of resources.
One of the things to notice immediately is that because of demographics and other factors, projected 10-year growth in potential GDP has never been lower. This is not based on credit conditions or other prevailing concerns related to the recent economic downturn. Rather, it is a structural feature of the U.S. economy here, and has important implications for the sort of economic growth we should expect in the decade ahead.
The green line is something of a hybrid of the two data series. Here, I've calculated the 10-year GDP growth that would result if the current level of GDP at any given time were to grow to the level of potential GDP projected for the following decade. This line takes the "output gap" into account, since a depressed current level of GDP requires greater subsequent growth to achieve future potential GDP. Notice here that even given the decline we saw in GDP last year, the likely growth in GDP over the coming decade is well under 3% annually - a level that we have typically seen in periods of tight capacity (that were predictably followed by sub-par subsequent economic growth), not at the beginning stages of a recovery.
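The "green line" calculation described above is straightforward compounding. A sketch with made-up numbers (the 7% output gap and 2% potential-growth path are our assumptions, not the article's data):

```python
def implied_annual_growth(current_gdp, potential_gdp_in_10y, years=10):
    """Compound annual growth rate that closes the output gap, i.e.
    carries current GDP to the decade-ahead potential GDP level."""
    return (potential_gdp_in_10y / current_gdp) ** (1 / years) - 1

# E.g., GDP currently 7% below a potential path growing ~2% a year:
current = 0.93
potential_10y = 1.0 * 1.02 ** 10
print(f"{implied_annual_growth(current, potential_10y):.2%}")
```

Even with that sizable gap to recover, the implied decade-long growth rate stays under 3% a year, which is the article's point.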
The situation is clearly better than it was at the 2007 economic peak, where probable 10-year economic growth dropped to the lowest level in the recorded data, but again, the likely growth rate is still below 3% annually over the next decade even given the economic slack we observe.
Aside from a gradual recovery of the "output gap" created by the current downturn, there is no structural reason to expect economic growth to be a major driver of investment returns in the years ahead. With valuations now elevated above historical norms, there is no reason to expect strong total returns on an investment basis either.
The primary element that is favorable at present is speculation – excitement over the prospect that the recession is over. Investors are presently anticipating the good things that have historically accompanied the end of recessions (strong investment returns and sustained economic growth), without having in hand the factors that have made those things possible (excellent valuations and a large output gap coupled with strong structural growth in potential GDP).

John F. Mauldin

How to Enable Bookmark Syncing in Chrome, Without an Add-on [Google Chrome]


Okay, so just a few minutes ago we pointed out that Xmarks now syncs bookmarks with Chrome, albeit in a closed alpha test. Now Google has announced that the latest Chrome dev channel release adds bookmark synchronization without any add-ons.

It's interesting timing, to be sure, and we'd guess the Xmarks folks were probably watching the dev channel and decided to rush the announcement to beat Google to the punch. The fact is, Xmarks still offers something fairly different from what Google's sync tool will offer: namely, Xmarks will be able to sync between Chrome and virtually any other popular browser, including Firefox, IE, and Safari.

Chrome's sync tool will only work with Chrome, which means if you haven't decided you're ready to be a full-on Chrome adopter, Xmarks will probably remain the better option. Still, it's great to see progress all around with bookmark syncing in Chrome.

You'll need to use the Google Chrome Channel Chooser to join the Dev channel to get this release, start Chrome with the --enable-sync flag, then just go to Wrench -> Sync my bookmarks.

Dev Channel Update [Google Chrome Releases]

