seamus condron

This week, Google gave the world the first major update on Chrome OS since the project was announced last year. While Google's operating system in the cloud won't be ready for prime time for six months, Google initiated a pilot program that includes a brand-new test notebook with Chrome OS installed. While many have been selected for the program and have already received their machines, many of you are still dying to get your hands on Google's latest project. Well, we have some good news! Google has been kind enough to give us five Cr-48 notebooks with Chrome OS installed, and we've decided to give them away to our awesome readers.

Beginning next Monday, we'll give away one notebook each day via our new @ReadWriteMobile Twitter account. To be eligible, you'll need to do the following:

- Follow @ReadWriteMobile on Twitter.
- Sharpen your brains. We'll be asking a brain-busting trivia question on topics to be determined by our crack team of quiz-show dropouts. The first one to answer correctly wins!
- Be on your toes (or tweeting toes). We'll ask our daily trivia question at completely random times. We'll give about ten minutes' notice, but that's it. Devious!

Be sure to tag your official answers with #chromeOSmas; otherwise you're getting coal. We'll collect the email addresses of the winners and send them along to Google, who will contact you to collect more information. Then you'll be getting your Chrome notebook right quick, just in time for the holidays.

UPDATE: We got word from Google that they can only send notebooks to U.S. addresses. Apologies to our international friends. Thanks for playing and happy holidays from ReadWriteWeb.
When the Association of American Publishers (AAP) released its sales figures for the month of February, the headlines were easy to compose: e-books have surpassed print in all trade categories. E-books have become the format of choice, these figures suggest. In January, the AAP said that e-book sales were up 116% year-over-year, and for the month of February that growth accelerated even further. February 2011 sales were up 202.3% from the same time last year.

Audiobook sales have also continued to grow. They were up 37% year-over-year for February, following an increase for the January period as well. That increase, along with the rise in e-books, points to the change in our book consumption habits: clearly we are "reading" (or listening) on the go – in the car, on a mobile device. Of course, a trade paperback has always been a fairly mobile "device" of its own accord. But its size and weight haven't saved it from falling sales. E-books became the number one format in all categories of trade publishing in February, surpassing adult hardcover, adult paperback, adult mass market, children's and young adult hardcover, and children's and young adult paperback.

Is This a Post-Holiday Sales Trend or Something Longer Lasting?

The AAP suggests that this surge is a continuation of post-holiday sales, as people buy e-books to load onto the e-readers they received as gifts. Whether or not this trend will continue long after the novelty of new tech toys wears off remains to be seen. What's also worth watching: not simply the sales of new titles, but the renewed consumer interest in old titles – backlist books that have been in print for over a year but that people want to buy again to load onto their new e-readers.

While that excitement to buy books might sound like good news for the publishing industry, the buzz over e-books hasn't stopped sales overall from falling. For the year to date, sales of e-books have grown by almost 170% to $164 million.
But the sale of print books, which is still a far larger portion of overall publishing revenue, has fallen by almost 25% to $442 million. Tom Allen, the CEO of the AAP, puts a positive spin on the news: "people love books." Perhaps. And perhaps e-readers will spur a new passion for reading (and buying). But for many readers, it may be less that we're buying more books than that we're buying books in a new format – moving away from the $25 hardcovers that have long floated the industry and toward $10 digital editions. That means the publishing industry has to sell a lot more e-books to make up the difference.

audrey watters
Following up on my comments on green product verification – One More Thing to Worry About – I recently learned about ICC ES-SAVE (Sustainable Attributes Verification and Evaluation), yet another program to verify the sustainability of products. While still new, at first glance this program appears to get it right. It has rigorous criteria, including on-site inspections of factories, providing end users with assurance that a company's claims of sustainability are true. Its goal is to provide "technically accurate product information that can be helpful to those seeking to qualify for points under major green rating systems (U.S. Green Building Council's LEED, Green Building Initiative's (GBI) Green Globes, or the proposed National Green Building Standard™)."

Nine Faces of Sustainability

SAVE has nine criteria under which products can be certified:

• Biobased material content
• Solar reflectance, thermal emittance, and solar reflective index of roof covering materials
• Regionally extracted, harvested, or manufactured materials or products
• Volatile organic compound (VOC) content and emissions of adhesives and sealants
• VOC content and emissions of paints and coatings
• VOC content and emissions of floor covering products
• Formaldehyde emissions of composite wood and engineered wood products
• Certified wood and certified wood content in products

Products that meet the criteria are issued a Verification of Attributes Report, or VAR (they seem to be slightly acronym obsessed, but I'll forgive them), confirming that their claims are true. The first, and thus far only, VAR issued is for bio-based content in the new Icynene spray foam. It is short and sweet, simply stating that the product has 8% bio-based content (with +/- 3% variation – is that 3% of 8% or 3% of 100%?
I could use an explanation.)

Washing our Hands of Greenwashing

Certifying buildings under the various green programs has thus far relied on manufacturers' statements that products meet the various requirements for recycled, local, bio-based content, and other related criteria. This method, with little if any oversight, is wide open to greenwashing. By providing independent third-party verification, the ICC should raise the bar for all programs, eventually eliminating the potential for incorrect or unsubstantiated claims. It should also help provide credibility that some of the industry-created and -managed programs may lack. As this program develops, we may see green building programs requiring ICC-SAVE, or equivalent, certification for sustainable-product points.

Marking Their Territory

I am sure that there will be plenty of fussing over which certification is better, or even acceptable. It would be a shame if every "certification" program were accepted by all the green building programs. It is likely, however, that the more rigorous programs will accept only a higher level of third-party certification, while others will be less restrictive. Just as we are seeing NAHB, LEED, Energy Star, and other local and regional programs out there marking their territory, I expect that the various product certifications will do the same. Think dogs and fire hydrants. It will be interesting to watch.
Matt and Kelly Grocoff's house, a Folk Victorian in Ann Arbor, Michigan, fits comfortably in their neighborhood's 19th century architectural vernacular. But the Grocoffs purchased the 110-year-old building, in 2006, for more than its decorative appeal and its place in Ann Arbor's vaunted Old West Side Historic District. One of the real selling points, especially for Matt, was that the house was something of a mess.

"It was a dream come true," Grocoff, founder of GreenovationTV and the green renovation expert for Old House Web, says in an OHW video about the house. "It had lead paint, asbestos siding, zero insulation, and even an old gas-powered lawnmower out in the shed!"

Playing off the notion that the greenest house is the one you don't build, Grocoff is using the upgrades to the Victorian to show how deep an energy retrofit can go in a house that not only is very old (by U.S. standards) but subject to relatively strict historic-district rules for renovation. It has turned out to be a long-term project, to say the least, but if Grocoff has his way, the house will soon operate at net zero energy. And because he and his wife (who have a young daughter) plan to stay in the house indefinitely and live in it as it makes its crawl toward better energy efficiency, Grocoff has been justifying many of the improvements by factoring in prospective savings on energy costs and preservation of original materials.

Tending to basics

He told local news service AnnArbor.com, for example, that the monthly utility bill for the home, which was operating with a 50-year-old furnace, was $350 in January 2007, when he installed a geothermal heat pump. (There was room for three 150-foot boreholes on the small lot.) The geothermal system halved the utility bill, which had been about $2,800 annually – a projected $56,000 over the next 20 years, Grocoff notes, if energy prices remained the same. The attic presented another dream opportunity to save money.
It had no insulation (except for a layer of newspaper from 1902), so Grocoff had the air leaks sealed and 20 inches of blown cellulose installed, bringing the thermal resistance of the attic floor to R-50. He also hired a contractor to remove siding panels on each face of the exterior, where cellulose was blown into the walls.

Dealing with the windows in the house required a workaround, since historic-preservation rules in the area prevent homeowners from replacing originals. The Folk Victorian's 15 double-hung windows, before restoration, were almost as leaky closed as open, so Grocoff called in a local window restoration expert, Lorri Sipes, for a tutorial on dismantling and refurbishing the wooden parts, sealing leaks, repairing broken channels, and other steps needed to get the windows in shape. That process cost a few hundred dollars – replacements would have run about $15,000 installed – and a fair amount of labor on Grocoff's part before the windows were ready for paint. The remaining issue, of course, was further reducing air leakage at each window, but the historic district allows installation of storm windows, so Grocoff ordered them from George W. Trapp Co., based in Redford, Michigan.

A pitch for solar and preservation

Grocoff calculates that getting the house to operate at net zero energy – he calls it his "Mission Zero" – will require a solar power system that he plans to buy from SunPower Corp. of San Jose, California, and have installed on the building's south-facing roof. He expects to spend about $14,000 on the purchase and installation after tax credits from the federal government and credits from the local utility company, Detroit Edison Energy.
The system will have paid for itself, he figures, in about five years. That he was able to win the Ann Arbor Historic District Commission's approval for the idea, Grocoff said in his interview with AnnArbor.com, could mean good things for many of the district's other historic buildings, not only in terms of their energy efficiency but also their preservation.

"Historic houses are really inefficient," he said. "They've got old, leaky windows, and so there's a long way to go in those historic houses because they're already so inefficient. So we have to fix those. We're not going to tear them down. It would be foolish and environmentally bad judgment to tear down old houses. So we want to preserve them, and a way to do that is to make them as energy efficient as possible and then produce renewable energy onsite."
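For readers who want to check the arithmetic, the quoted figures hang together. This is a rough sketch using only the dollar amounts in the article; it uses simple payback and, like the article, ignores energy price inflation and financing:

```python
# Back-of-the-envelope check of the savings figures quoted above.

old_annual_bill = 2800                 # $ per year before the geothermal retrofit
geo_annual_bill = old_annual_bill / 2  # "halved the utility bill" -> $1,400/yr

# Projected 20-year spending at the old rate, as quoted ($56,000):
print(f"20-year cost at old rate: ${old_annual_bill * 20:,}")

# Solar: ~$14,000 net cost and a ~5-year simple payback together imply
# annual savings (presumably via net-metered production) of about:
solar_cost = 14_000
payback_years = 5
print(f"Implied annual solar savings: ${solar_cost / payback_years:,.0f}")
```

The ~$2,800/year implied by the solar payback is an inference from the two quoted numbers, not a figure Grocoff states directly.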
Alex is founder of BuildingGreen, Inc. and executive editor of Environmental Building News. Watch for a forthcoming BuildingGreen special report on windows. To keep up with his latest articles and musings, you can sign up for his Twitter feed.

I've examined state-of-the-art windows and glazing systems over the past four weeks. This week, I'll cover an innovative product that may help define the state of the future: a dynamic glazing called SageGlass that can be tinted on demand. To understand what's so exciting about such a product, let's look at conventional high-performance windows.

When we select a window for a particular application, we have to determine what performance properties will be best for that application — on average. In the winter, and for daylighting in all seasons, we usually want to maximize sunlight transmission, while in summer, and for certain windows at particular times of day, we usually want to restrict solar gain to control glare or unwanted heat gain. By carefully choosing the glazing, we can do a pretty good job of balancing those differing priorities, but it is always a matter of compromise. If we want to be able to vary the sunlight coming through a window, we have historically had to rely on a separate system, such as an interior window blind or exterior shading system. In an office building, these systems can get pretty complex.

Enter dynamic glazings

With dynamic, or switchable, glazing, we can adjust the solar and visible transmittance properties of the glazing according to the conditions. Dynamic glazing can be passive, with tinting controlled by temperature (thermochromic) or by sunlight (photochromic), or it can be controlled by electricity (electrochromic). With the latter, tinting is actively managed by the user, though the process can be automated.
SageGlass, the electrochromic glazing made by Sage Electrochromics, is the only widely marketed dynamic glazing on the market today, and the company has made some significant advances since I last covered the product in this blog a few years ago.

SageGlass relies on a low-voltage electric current to tint the glazing. The company uses a sputtering process (like that used to produce soft-coat low-e glazings) to coat a pane of glass with five extremely thin, highly specialized ceramic layers. These coatings are typically applied on the #2 surface (the inner surface of the outer pane of glass). When voltage is applied across these coatings, according to Sage, ions travel from one layer to another, prompting a reversible solid-state change that causes the coating to tint and absorb light. Reversing the polarity of the applied voltage causes the ions to migrate back to their original layer, returning the glass to its clear state.

The electrochromic coating in SageGlass is also a low-e coating, so whether or not the tinting is employed, heat loss through the window is reduced, as it is with other low-e glazings. The electric current needed to tint the glass is very small (about 0.3 watts per square foot of glass), and even less current (0.1 watt per square foot) is required to maintain the tinted state. When the current is turned off, the glass reverts to its normal, clear state. This level of electricity consumption is quickly paid for by reduced cooling loads when the tinting is deployed — at least in commercial buildings, where SageGlass is most commonly used.

I had an opportunity to talk with the Sage Electrochromics CEO, John Van Dine, recently, and learned about some exciting innovations. The biggest recent development is the ability to set intermediate levels of tinting.
When SageGlass first came out, it was either in its clear or tinted state, with solar transmittance of 62% and 3.5%, respectively (assuming the coating is on standard clear glass in an insulating glass unit, or IGU). Now, when the variable tinting option is included (for an upcharge), SageGlass can be "programmed" with two intermediate tinting levels. This provides much greater flexibility for the user, since the fully tinted state is darker than what most users would want. One of the intermediate tinting levels is typically set at 15-20% visible light transmission and the other at 35-40%.

An advance that is being actively worked on but is not yet available with SageGlass is powering the tinting with photovoltaic (PV) electricity. With this feature, installation may be simpler if the power system is integrated into the glass or frames somehow. We'll have to see how that feature evolves.

A new factory

Sage Electrochromics has a new factory nearing completion this month, though it will take several months to fully commission the manufacturing equipment. SageGlass production will be significantly automated, and much larger IGUs, up to five feet by ten feet, can be produced.

While the automation in the new plant should eventually bring prices down (SageGlass is not an inexpensive product!), Van Dine cautioned that "you will not see a big price reduction initially." Because of the customization with most glass orders, pricing is heavily dependent on the volume of SageGlass needed for a project, the number of different glazing sizes in the order, special shapes, laminated and other specialty glass needs, the number of control zones, and the degree of control functionality needed.
While the company is reluctant to mention specific costs, given these variables, I gather that, for a typical project, incorporating SageGlass into IGUs will add about $50 per square foot, but for small projects the cost can be significantly higher.

Van Dine emphasizes that SageGlass offers not just glazing, but an entire solution for fenestration control. It can replace both exterior shading systems and interior blinds — which in commercial buildings can be very pricey. When the SageGlass system is compared with these multiple systems (glazing plus separate glazing attachments), SageGlass can be quite cost-competitive.

The company has partnerships with more than a dozen manufacturers of commercial glazing systems, residential windows, and skylights. Residential window and skylight makers whose products can now be ordered with SageGlass include Marvin, H Window, and Wasco Skylights.

To date, SageGlass has been installed in hundreds of projects, but it plays a major role in the overall lighting control in significantly fewer. The product is extensively used in a new 300,000-square-foot Siemens wind-turbine facility that has just been built in Hutchinson, Kansas. During the design of that facility, the SageGlass option came out less expensive than two other sunlight-control strategies that were evaluated and more expensive than a third, which was a very simple shading system.

Sage Electrochromics is located in Faribault, Minnesota, about 40 minutes outside of Minneapolis. Several years ago, Saint-Gobain, the world's largest building products company, invested in the company and is now a majority owner.
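To put the power figures quoted earlier (0.3 W per square foot to switch, 0.1 W per square foot to hold the tint) in perspective, here's a rough calculation. The 3 ft × 5 ft window size is an illustrative assumption, not a dimension from the article:

```python
# Rough power draw for tinting a single window at the quoted figures.
TINT_W_PER_SQFT = 0.3   # while the glass is switching
HOLD_W_PER_SQFT = 0.1   # to maintain the tinted state

width_ft, height_ft = 3, 5        # hypothetical window size
area = width_ft * height_ft       # 15 sq ft

print(f"Switching draw: {area * TINT_W_PER_SQFT:.1f} W")
print(f"Holding draw:   {area * HOLD_W_PER_SQFT:.1f} W")

# Even held tinted around the clock, the annual energy use is trivial:
kwh_per_year = area * HOLD_W_PER_SQFT * 8760 / 1000
print(f"Holding all year: {kwh_per_year:.0f} kWh")
```

A window held tinted year-round would use on the order of a dozen kilowatt-hours, which is why the cooling-load savings dominate so quickly.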
With the recent heat wave that set all kinds of records across the US, including an all-time high of 106° F here in Atlanta, air conditioning has become quite the topic of conversation. Why, just yesterday I overheard two little old ladies* on a park bench debating thermostatic expansion valves versus capillary tube metering devices — and almost coming to blows over it! Most people aren't quite as informed as these air conditioning aficionados, though, so today would be a good time to get up to speed on how your air conditioner works so that you'll be in the know when you take your next stroll in the park.

I'm going to give an intermediate explanation of the refrigeration cycle here, so if you'd like to take a step back and start with a more basic one, see this explanation of how an air conditioner works I wrote for lay people. (Another good article that covers the basics of the refrigeration cycle, as well as answers to some commonly asked questions, is Martin Holladay's Air Conditioner Basics.) Today we go a little deeper, though. I'll name the parts of the refrigeration cycle and talk a bit more about the changes in the refrigerant as it goes through the cycle.

Still, there's a lot I'm not going to cover in this article. I'm not going into the technical details of checking refrigerant levels, the properties of the different refrigerants used, how efficiency (SEER rating) is calculated, or any number of other important aspects of air conditioning systems.

The refrigeration cycle

The way your air conditioner pumps heat from inside to outside is through the refrigeration cycle, a thermodynamic cycle involving a special fluid — the refrigerant — that undergoes phase changes (between liquid and vapor), pressure changes, and temperature changes. The diagram below shows how the refrigerant flows through the air conditioning system and what's happening along the way.

First, notice that the diagonal line shows which parts are inside the house and which parts are outside. If you have a split system, with the condensing unit sitting outside making noise and the air handler inside (which could mean in the crawl space or attic), the components are in different boxes. In a window AC or package unit, they're all in the same box. Now, let's go through the four stages of the refrigeration cycle, one by one.

1. Evaporator coil

This is where the refrigerant picks up heat from inside the house. The evaporator coil is a copper tube, which carries the refrigerant, embedded in a framework of aluminum fins (photo below). This configuration gives the refrigerant a lot of surface area in contact with the air blowing over it, which aids heat transfer from the air to the refrigerant. The most common geometry is the A-coil (below), but you also see flat coils and N-coils in some units.

The refrigerant comes into the evaporator coil as a liquid at a low temperature and low pressure. The air handler's fan (a.k.a. the blower) blows air from the house across the coil. The evaporator coil is cold (about 40° F), and the air from the house is warm (about 75° F, depending on where you set your thermostat). Heat flows from warmer to cooler, so the air temperature drops, and the refrigerant picks up the heat lost by the air. This is the Second Law of Thermodynamics in action.

In addition to getting warmer, the refrigerant also changes phase here. It's called the evaporator coil, after all, so the cold liquid refrigerant coming in evaporates and becomes a vapor. Phase changes are a great way to transfer heat because it takes a lot more heat to cause a phase change (especially between liquid and vapor) than it does to change the temperature of a material. Thus, when the refrigerant starts boiling, it really sucks up the BTUs (British Thermal Units).

One aspect of the flow may be confusing when you look at your air conditioner and try to figure out how the refrigerant flows from inside to outside. If you have a split system, the inside and outside pieces are connected by the refrigerant lines, often called simply the "line set." These are two copper tubes, one larger and insulated (the suction line) running parallel to another smaller and uninsulated tube (the liquid line). When the vaporized refrigerant leaves the evaporator coil, it's still fairly cold. Hence, the refrigerant going to the compressor outside is in the larger, insulated copper tube, not the smaller, warmer tube. As paradoxical as it seems, the colder tube is the one carrying the most heat. You can see that the suction line is the colder tube in the photo of the line set below: it even has frost on it, which is not normal and indicates a problem (e.g., an incorrect amount of refrigerant or too little air flow over the evaporator coil).

2. Compressor

When the refrigerant reaches the outdoor part of your air conditioner (photo at top), the compressor does exactly what its name suggests: it squishes the refrigerant down to a smaller volume, thus increasing the pressure and the temperature. Why? Because heat flows from warmer to cooler. You can't get rid of heat from a cold refrigerant to 95° F air. The refrigerant has to be warmer than the outside air. That's one reason why we need the compressor. The other is that the compressor is the pump that moves the refrigerant through the system.

3. Condenser coil

When the high pressure, high temperature, still-vaporized refrigerant leaves the compressor, it enters the condenser coil. As with the evaporator coil, it's a copper tube embedded in aluminum fins that allows for efficient heat transfer. A fan inside the condensing unit pulls outdoor air through the sides of the coil and blows it out the top of the unit. Because of the work the compressor did, the refrigerant is hotter than the outdoor air. The Second Law of Thermodynamics kicks in here, and heat flows from the warmer refrigerant to the cooler outdoor air blowing over the condenser coil.

In the evaporator coil, the refrigerant changes from liquid to vapor at a relatively low temperature. Now we're dealing with higher temperatures, and the phase change reverses. If you're confused by the phase change happening at different temperatures, think about what happens when you go to high altitudes. The air pressure is lower, and water boils at less than 212° F. Pressure changes affect the boiling/condensation temperature, which is why boiling happens at a low temperature in the evaporator coil and condensation happens at a high temperature in the condenser coil. After returning to the liquid state, the refrigerant travels through the liquid line (the hot, uninsulated copper tube) back to the indoor part of the air conditioner.

4. Expansion valve — where the magic happens

Once the refrigerant gets back to the indoor unit, it passes through a metering device, such as an expansion valve, and the magic of the refrigeration cycle happens here. The high pressure, relatively warm liquid runs into a constriction that doesn't allow the refrigerant to pass through easily. As a result, when the liquid does get through to the other side, it finds itself at a much lower pressure. When the pressure drops like this, so does the temperature — a lot! This is what makes air conditioning possible. Without being able to get the refrigerant down to temperatures below the air in your home, an air conditioner wouldn't be able to work. Why? Because heat flows from warmer to cooler, the old Second Law of Thermodynamics again.

After passing through the metering device, the refrigerant goes directly into the evaporator coil, and the cycle begins anew. There you have it. If you followed what I wrote above, you're ready to head out to the park benches.

*Well, OK, maybe I didn't really overhear this conversation, but I swear I have been in parks and seen little old ladies talking on park benches.
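The four stages above can be sketched as a toy trace. The temperatures here are illustrative round numbers in the spirit of the article's examples (40° F coil, 75° F room, 95° F outdoor air), not properties of any real refrigerant:

```python
# A toy trace of the four-stage refrigeration cycle described above.
STAGES = [
    # (component, phase leaving, approx. temp leaving °F, pressure leaving)
    ("evaporator coil", "vapor",  45,  "low"),   # absorbs heat from ~75°F indoor air
    ("compressor",      "vapor",  150, "high"),  # work input raises pressure and temp
    ("condenser coil",  "liquid", 110, "high"),  # rejects heat to ~95°F outdoor air
    ("expansion valve", "liquid", 40,  "low"),   # pressure drop cools the refrigerant
]

def trace_cycle(indoor_air_f=75, outdoor_air_f=95):
    """Check that heat can flow the right way at each coil."""
    evap_temp = STAGES[3][2]   # refrigerant temp entering the evaporator
    cond_temp = STAGES[1][2]   # refrigerant temp entering the condenser
    assert evap_temp < indoor_air_f    # coil colder than room air: absorbs heat
    assert cond_temp > outdoor_air_f   # refrigerant hotter than outside: rejects heat
    return [stage[0] for stage in STAGES]

print(" -> ".join(trace_cycle()))
```

The two assertions are the Second Law constraint the article keeps returning to: each coil only works because the refrigerant is on the correct side of the air temperature it exchanges heat with.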
What goes under the concrete in a slab-on-grade home? In the old days, not much — just dirt. Eventually, contractors discovered that it made sense to include a 4-inch-thick layer of crushed stone under the concrete. The crushed stone provides a capillary break that reduces the amount of moisture flowing upward from the damp soil to the permeable concrete. Since the crushed stone layer provides a fairly uniform substrate, it may also reduce the chance that a concrete slab will be poorly supported by random pockets of soft, easily compressible soil.

Eventually, polyethylene was invented. Concrete contractors learned that a layer of poly helps to keep a slab dry, because it stops upward vapor diffusion from the soil. Finally, some contractors in cold climates began installing a continuous horizontal layer of rigid foam insulation under their concrete slabs. The foam layer isolates the room-temperature slab from the cold soil under the slab.

At this point, we've got a sandwich with three or four layers, and the question arises: does the order of the different layers matter? What goes down first, and what goes down last? According to most building scientists, here's how the layers should go, from the bottom up: crushed stone; rigid foam; polyethylene; concrete.

Some contractors may ask: Is it a mistake to put the polyethylene lower down in the sandwich? The answer is yes. To understand why, it's useful to study the history of blotter sand. Beginning in 1989, the American Concrete Institute (ACI) recommended the installation of a 4-inch layer of granular material between a sub-slab vapor retarder and a concrete slab.
ACI standard 302.1 R-96, Guide for Concrete Floor and Slab Construction, included this recommendation in Section 4.1.5: "If a vapor barrier or retarder is required due to local conditions, these products…
Paul Eldrenkamp is the owner of Byggmeister, a design/build firm based in Newton, Mass. Rachel White is the company's performance manager. This blog, the third in a series of posts about the project, originally was published at the Byggmeister website. The first and second posts are Preparing a Historic Home for the Next 100 Years and Planning for Net Zero Energy.

A few months ago we completed a deep energy retrofit of a house that we hope will be net-zero energy — in other words, that we hope will produce as much energy as it consumes on an annual basis. If we succeed, this will be our first net-zero project.

There are two key strategies for designing a net-zero-ready home: you minimize the amount of energy the house needs to operate, and you maximize the amount of energy that the house can produce onsite, usually with a photovoltaic (PV) system. In short, you create an energy budget that balances consumption with production.

Implementing these strategies on a new home is relatively straightforward. In fact, leaders in the high-performance building industry have been building net-zero-ready homes for years now, and some are even doing so at market rate. (To learn more about one Massachusetts company, Transformations Inc., that is building market-rate and affordable net-zero energy homes, click here.)

Existing houses are a bigger challenge

The same can't be said for existing homes. On the contrary, retrofitting a home to operate on a net-zero basis is a pretty audacious undertaking. Most of the homes we work on push back hard against such ambition. There are several obstacles, including our relatively harsh climate; the older, architecturally complex character of our housing stock; the frequency with which homes have been renovated without attention to efficiency; and the prevalence of smaller lots, particularly in urban settings.

But with this project we had two things going for us: a relatively accommodating building and highly motivated homeowners (not to mention our top-notch planning team, led by David Foley of Holland and Foley Architecture).

The home's relatively simple form made it possible for us to superinsulate the building shell and drastically cut air leakage. This drove the heat loss down low enough that we were able to significantly downsize the mechanical system, replacing the existing gas boiler with three minisplit heat pumps for both heating and cooling. (Heat-recovery ventilation ensures a steady supply of fresh air.)

Equally important is the home's unshaded south-facing roof, which supports an 11.7-kW PV system projected to produce 15,000 kWh per year, or 120% of what the house is modeled to consume. (For context: according to the Energy Information Administration, the average Massachusetts household consumes roughly 32,000 kWh annually, or two and a half times as much as this house.)

It's up to the homeowners to make it work

Now it's up to the highly motivated homeowners to actually make it to net zero — i.e., to live within their energy budget. We are currently in the process of gathering monthly usage and production data to determine how close we get to this goal. We are also monitoring at a more finely grained level, increasing the likelihood that we will be able to get there. With the help of the SiteSage eMonitor system, we are collecting circuit-by-circuit usage data. Monitoring this closely has several purposes, not least of which is that it tells us whether equipment is working properly.
When a usage pattern looks atypical, we can sometimes use the eMonitor to help us troubleshoot in real time. We can also use circuit-by-circuit monitoring to help the homeowners develop net-zero “habits” — to give them the feedback they need to live within their energy budget. We can let them know when they’re at risk of exceeding their budget and overdrawing their “energy account.”

While in many respects living in a net-zero energy home is easier than living in a standard home (for one thing, a well-designed net-zero energy home is far more comfortable from a thermal standpoint), net-zero living does take some adjustment. Much of the conventional wisdom about operations and maintenance, particularly with regard to mechanical equipment, doesn’t translate well to low-energy homes. For example, minisplit heat pumps operate most efficiently when they are maintained at a fairly constant temperature. The eMonitor data can help us identify whether the homeowners are developing the net-zero habit of “setting and forgetting” their minisplits, or falling back into the habit of using setbacks.

It remains to be seen whether we will achieve our goal of net zero energy. If we do, there’s no question that thoughtful operation on the part of the homeowners, supported by monitoring on our part, will have played a key role in getting us there — at least as key as lucking into such an accommodating house in the first place.
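The energy budget described in this post is simple arithmetic, and the figures cited (15,000 kWh of projected production, at 120% of modeled consumption) can be checked in a few lines. This is a back-of-the-envelope sketch; the constant names are mine, not part of any monitoring system:

```python
# Back-of-the-envelope check of the article's net-zero energy budget.
PROJECTED_PRODUCTION_KWH = 15_000   # 11.7-kW PV system, per the article
PRODUCTION_VS_CONSUMPTION = 1.20    # production is 120% of modeled consumption

modeled_consumption = PROJECTED_PRODUCTION_KWH / PRODUCTION_VS_CONSUMPTION
print(f"Modeled consumption: {modeled_consumption:,.0f} kWh/year")
# Modeled consumption: 12,500 kWh/year

# EIA figure cited for the average Massachusetts household
MA_AVERAGE_KWH = 32_000
print(f"MA average is {MA_AVERAGE_KWH / modeled_consumption:.1f}x this house")
```

The ratio works out to about 2.6, consistent with the “two and a half times” comparison in the post.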
I love insulation. It’s a wonderful thing because it saves energy. It makes buildings more comfortable. And it’s pretty inexpensive considering how long it lasts (or should last). I get asked a lot for my opinion on the best insulation to put in a building, and my answer is straightforward: well-installed insulation is the best. I like fiberglass. I like cellulose. I like spray foam. I like mineral wool. I like blown, sprayed, batt, and rigid insulation. Yeah, different materials have different properties, with their own advantages and disadvantages. But if insulation is installed well and protected by good water and vapor control layers, it should do its job for a long, long time. So, what are my two ways to make sure you get the most out of your insulation? Both have to do with installation.

1. Request a minimum thickness

Way back in 2011 I wrote an article called “Is There a Downside to Lumpy Attic Insulation?” I refer to it now and then, but it’s important enough to make it the highlight of this article. The point of that article was that if you install insulation uniformly, as in the lead photo above, you’ll get much better performance than from insulation installed (or later disturbed) like you see in the photo reproduced as Image #2 at the bottom of the page. Flat beats lumpy. In that article, I showed an example of an attic done two different ways. First, you insulate the attic uniformly to a thickness that gives you R-30 everywhere. You can’t do this in a typical attic because the roof framing doesn’t give you enough space over the eave walls to get full thickness, so you’d have to do something like use raised-heel trusses. But we’re going to assume here that you get full thickness everywhere, because that’s what you should be doing, even if it’s not required by code. In the other scenario, I looked at what happens if you take the same amount of insulation and install it so that you have enough thickness for R-10 on one side of the attic and R-50 on the other side.
Your first guess may be that the average resistance to heat flow would be R-30, since 10 and 50 average to 30. But you’d be wrong. In the article about lumpy insulation, I showed the calculation, and it comes out a lot less than R-30. In fact, at about R-17, it’s roughly half. That means you have almost twice as much heat flow even though you have the same amount of insulation. So, rule number one is to make sure your insulation contractor isn’t selling you on average thickness. If they are, they’re getting away with selling you less R-value.

Where this matters most is when you have less thickness of insulation. For example, closed-cell spray polyurethane foam has a much higher R-value per inch than many other insulation types, so you usually get less thickness. In a 2×4 wall where you need R-13 to meet code, spray foam contractors usually install two inches of closed-cell spray foam. Since it’s usually rated at about R-6.5 per inch, that means if, say, 25% of your wall has only 1.5 inches, you get about R-10 there instead of R-13. If the rest of the wall is right at 2 inches thick, your average R-value in the cavities is 12, not 13. You’re not getting what you paid for.

2. Request Grade I installation quality

In addition to making sure you get the thickness to achieve the R-value you’re paying for, you should also make sure the insulation is installed in other ways that ensure it achieves maximum R-value. RESNET created an insulation grading protocol back in 2006, and certified home energy raters have to use that protocol for every rating they do. When they’re inspecting a house, they have to determine the R-value for each insulated assembly and also the grade (Grade I, II, or III). Grade I is the best, Grade III the worst. The protocol is based on looking for two things. First, the amount of missing insulation determines what grade it might be. The RESNET illustration is reproduced as Image #5 below.
The dark areas represent gaps in the insulation. Officially, Grade I means essentially no gaps, Grade II can have up to 2% gaps, and Grade III can have no more than 5% missing insulation. The other factor is compression and incomplete fill. The insulation might fill the cavity completely from side to side and top to bottom but still have a reduced R-value if it’s compressed or doesn’t fill the cavity completely from front to back. (I’m thinking of walls when I use those directional terms. Adjust as necessary for ceilings and floors.) I wrote a thorough explanation of the grading protocol back in 2012, so check it out for more detail. And yes, Grade I is possible with fiberglass batts, too. I’ve seen it done a few times, as in the photo below (Image #6) from a Habitat for Humanity project in Nashville.

Takeaways

When you get insulation, you want to make sure you get the full R-value you’re paying for. Do these two things:

Insist on having the insulation installed to a minimum thickness, not an average thickness.
Insist on Grade I installation quality.

By the way, if you read the manufacturer’s instructions for installing insulation, they generally align with Grade I installation quality — so you’re not really asking for anything special here. This certainly isn’t all there is to getting a good insulation installation. Before you ever get to the installation part of the job, way back in the design phase, it’s a good idea to see what you can do to eliminate thermal bridging and make sure you can get full thickness everywhere (as with raised-heel trusses).

Allison Bailes of Decatur, Georgia, is a speaker, writer, building science consultant, and the author of the Energy Vanguard Blog. You can follow him on Twitter at @EnergyVanguard.
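The averaging trap in this article comes from the fact that side-by-side heat flow paths add in parallel: you average conductances (U = 1/R), not R-values. A minimal sketch of both of the article’s examples, with variable names of my own choosing:

```python
def effective_r(sections):
    """Area-weighted effective R-value for parallel heat flow paths.

    sections: list of (area_fraction, r_value) pairs; fractions sum to 1.
    Parallel paths combine by conductance (U = 1/R), so the effective
    R-value is 1 / (area-weighted average U), not the average of the Rs.
    """
    u_avg = sum(frac / r for frac, r in sections)
    return 1 / u_avg

# Lumpy attic: half the attic at R-10, half at R-50.
print(round(effective_r([(0.5, 10), (0.5, 50)]), 1))  # 16.7, not R-30

# Spray foam wall cavities: 25% at 1.5 in (R-9.75), 75% at 2 in (R-13),
# assuming closed-cell foam at about R-6.5 per inch, as in the article.
print(round(effective_r([(0.25, 1.5 * 6.5), (0.75, 2.0 * 6.5)]), 1))  # 12.0
```

Note that this considers only the insulation itself; framing members (thermal bridging) would pull the whole-assembly numbers down further.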
This post originally appeared at Ensia. Susan Liley didn’t set out to become an activist. “A grandma, that’s all I am,” she says. But when her hometown of De Soto, Missouri, flooded four times in three years, Liley felt called to act. After the first couple of floods, Liley did what she could to help her neighbors: she dragged waterlogged furniture from a friend’s home and delivered eggs from her chickens to those without electricity. But the third time around, Liley says, “I got mad.”

Across the U.S., flood survivors are growing in number and — like Liley — they’re getting mad and fighting back. From city streets to subdivisions and trailer parks, they are comparing notes with neighbors and asking hard questions about the rising tide. They are messaging each other on Facebook, packing meeting halls, and lawyering up. And, increasingly, they are seeking not just restitution, but answers. Flood survivors are identifying the root causes of repeated flooding and working toward solutions. Most recently, their ranks were swelled by a March “bomb cyclone” in the Upper Midwest, which unleashed catastrophic flooding that was visible from space.

According to the 2018 National Climate Assessment, climate change is driving more severe floods in many parts of the country. Sea-level rise is inundating coastal cities, where “sunny-day flooding” is now a thing. Rising seas contribute to high-tide flooding, which has grown by a factor of five to 10 since the 1960s in many U.S. coastal communities — and that trend is expected to accelerate in the future. Farther inland, increased rainfall is a major culprit.
Because a warmer atmosphere holds more water vapor, the past few decades have seen many more “heavy precipitation events,” especially in the Northeast, Midwest, and upper Great Plains. In the Northeast, for example, heavy rains pack 50% more water than they did before 1991. Not surprisingly, those deluges have led to more flooding from Albany, New York, to Duluth, Minnesota.

Not just the climate

But climate isn’t the only reason we are seeing more floods. Ill-conceived development, especially in flood-prone areas, replaces water-absorbing forests and wetlands with impermeable surfaces — so there is simply nowhere for all that water to go. While the risks of building in a floodplain may seem obvious, such construction continues nonetheless — in part because waterfront properties are in high demand, commanding premium prices that boost real estate tax income for local governments.

In De Soto, both factors are at play. There is more precipitation, according to Liley: “It used to be 3 or 4 inches of rain, and now we get 7 to 10.” But the town also hugs the banks of flood-prone Joachim Creek. Over the years, construction of new homes and roads has thwarted the creek’s natural drainage and put more people in harm’s way. Liley remembers tragedy striking in 2003, when a flash flood in Joachim Creek led to one death. “We didn’t realize it was a preview of things to come,” Liley says. In 2013, another flash flood killed two people: an elderly woman who was washed away by the torrent, and another who died while being evacuated. When De Soto flooded again in 2015, Liley reached her limit. “Three of us ladies were talking on Facebook and said we have to do something. So we met the next morning and organized the Citizen’s Committee for Flood Relief.”

The committee’s first priority was to figure out some kind of early warning system. While coastal and riverine floods can be (imperfectly) predicted in advance, flash floods by definition arrive unannounced.
Second, they sought to understand the root causes of repeated flooding and address them.

Higher Ground lends a hand

Liley’s group got a powerful assist from an organization called Higher Ground (formerly Flood Forum USA). A project of the nonprofit Anthropocene Alliance, Higher Ground is the largest national flood survivor network in the U.S. It currently links 43 flood survivor groups in 20 states — inland and coastal, urban and rural, representing a wide range of demographics and political affiliations. Higher Ground was founded by Harriet Festing, a former British civil servant and goat farmer who came to the U.S. in 2011 when a Conservative government eliminated the climate and energy department for which she worked. Festing took a job with the Center for Neighborhood Technology in Chicago. There she met a woman named Helen Lekavich, a hairstylist-turned-organizer who demonstrated what a passionate group of flood survivors could accomplish. After enduring repeated floods in her town of Midlothian, Lekavich and her neighbors organized a group called Floodlothian Midlothian, which eventually won a $7.6 million flood control project from the Metropolitan Water Reclamation District.

With 41 million people estimated to be living in flood zones, Festing says, “imagine if we could find Helen Lekaviches across the country and create a unified voice! So that’s what we set out to do.” She reached out to survivors’ groups — finding them on Facebook, in local media, and through word of mouth — and Higher Ground was born. “The leadership to address flooding and other climate impacts needs to come directly from the people and communities that are most affected,” says Festing. But these issues are complex, requiring expertise beyond the understanding (and pocketbooks) of survivor groups.
So, in partnership with the American Geophysical Union’s Thriving Earth Exchange and three other partners, Higher Ground matches flood survivors with experts in hydrology, floodplain management, citizen weather monitoring, insurance, law, case management, planning, and architecture. And Higher Ground links survivors’ groups with one another so they can trade notes and strategies — for example, by holding a monthly videoconference and leadership forum.

In De Soto, Higher Ground matched Liley’s group with scientists from Saint Louis University and the U.S. Geological Survey who helped create a simple but effective flood warning system. Sensors in Joachim Creek now send messages to a phone app that pings residents when the creek rises over a certain level. “When it’s 8 feet over, we’re in trouble,” says Liley. “But when it’s 10 feet over, you better be out of there because it’s going to be in homes.” Higher Ground also helped Liley’s group petition their senators and members of Congress to commission a $200,000 watershed study for the city of De Soto. The study, conducted by the U.S. Army Corps of Engineers and its state-level Silver Jackets team, will show, Liley says, how green infrastructure, such as restored wetlands and parks, can minimize flood risk along Joachim Creek. The study’s completion was delayed by the recent federal government shutdown. And other hurdles remain — namely, money. “All this work that the Corps of Engineers has done, without funding for implementation, we will get nowhere,” Liley says. Still, identifying the problem is a crucial first step.

A flooding whodunit

Sometimes, identifying the problem has all the drama of a whodunit. That’s how it played out in Richwood, Texas, where residents rode out Hurricane Harvey without any notable flooding. Then, “four days after Harvey vamoosed on out of here, water started backing up into our neighborhood,” remembers Kevin McKinney, a self-employed transportation safety consultant.
McKinney had 3 feet of water in his home for nine days. “I lost everything I had,” he says. Yet, despite Harvey’s historic rainfall totals, something did not sit right with McKinney and his neighbors. “There are people who have lived here for 45 to 50 years, and never, ever flooded,” McKinney says. “Why now?” Richwood residents did some investigating; one even deployed a camera-equipped drone to get a bird’s-eye view. They claim to have discovered that the City of Lake Jackson used pumps and sandbags to divert floodwater to Richwood’s Bastrop Reservoir, which overflowed into Richwood residents’ homes. “They had three pumps going at 6,800 gallons a minute, running for 10 days,” says McKinney. “The water was actually flowing uphill.” The City of Lake Jackson denies the charges.

The people of Richwood organized. They formed a Facebook group called Flood Victims of Richwood and called meetings that packed a local church. And they joined up with Higher Ground, which matched them to a hydrologist who is using lidar data to analyze the post-Harvey flood. Now more than 400 homeowners are suing the City of Lake Jackson for $45 million, according to Matias Adrogue, the lawyer representing the citizens of Richwood who brought the lawsuit. McKinney says the goal of the lawsuit is to find out what happened and make sure it doesn’t happen again. And he wants to see the survivors compensated for their losses. But there is a deeper principle of fairness he wants to address: “We need to find a solution together,” McKinney says. “You just don’t flood your neighbors.”

The rich get richer

Questions of fairness are increasingly on flood survivors’ minds. Floods are sometimes seen as equal-opportunity disasters that affect rich and poor alike. But a substantial body of research (highlighted in a recent exposé by NPR) shows that federal aid actually leaves wealthy, white communities better off after natural disasters — while the reverse is true for low-income communities of color.
Constance C. Luo, a community organizer for the Texas Organizing Project in Houston, has seen this play out in the recovery from Hurricane Harvey. “Harvey did not discriminate,” she says. “People in richer areas did severely flood, and it was terrible. But whether you got assistance depended on things like the flexibility of your employer or whether you had flood insurance. So many wealthy families found themselves to be prosperous after Harvey, while other families went bankrupt.” The people who went bankrupt, Luo says, are those who work low-wage jobs and cannot take time off work to navigate the complex bureaucracy of disaster assistance. A disproportionate number come from the low-income African-American and Latino neighborhoods of Northeast Houston, where a lack of investment in infrastructure and poor drainage led to a high number of flooded homes.

Given that disparity, the Texas Organizing Project fought for — and won — a county program that prioritizes investment in low-income neighborhoods for flood recovery and prevention. But that plan has drawn fierce opposition from affluent Houstonians who say bond funds should be evenly dispersed throughout the city. “The question,” says Luo, “is whether the bond projects should be equal to everyone, or equitable — weighted toward neighborhoods that traditionally have had very little attention to their flood infrastructure. We stand on the side of equity.”

To bolster its case for equitable flood recovery, the Texas Organizing Project joined up with Higher Ground in 2018. The group was matched with geologist Edith Newton Wilson, owner of Rock Whisperer LLC in Tulsa, Oklahoma, who is mapping flood risks in Northeast Houston. The maps show high and low ground, bayous, drainage infrastructure, and other factors that shape risk and resilience. For Luo and other community residents, the maps are revelatory. “There’s real power to being able to identify your place on a map, and say ‘Oh!
People on the other blocks near me suffer from this, too! Oh! We’re all in the floodplain! That’s why our insurance is so high.’” In this way, the mapping project is educating Northeast Houstonians about flood risk management — and providing vital data for advocacy. “I strongly believe that community, fighting hand in hand with science, is an unbeatable force,” says Luo.

The future of flooding

That unbeatable force will have much to contend with in the decades to come, as climate change and development raise flood risks across the U.S. In some places, those risks pose an almost existential challenge; the future of the community hinges on finding better ways to channel, divert, and live with water. Charleston, South Carolina, is one such place. Thanks to sea-level rise, land subsidence, and development in low-lying areas, Charleston is on track to experience sunny-day flooding more than half the year — 187 days — by 2045. “What does that mean?” asks Eileen Dougherty, who runs a commercial fishing business in Charleston. “That’s going to massively change the way that we live. That affects our basic safety services, our firefighters. Can the ambulance get to your house? Can children get to school? So, we have a lot of things to look at here in Charleston.”

Dougherty — like Liley and McKinney — became an unwitting activist on this issue when her land began to flood. The culprit, she believed, was the new 294-unit apartment building next door, which had altered the soil and the flow of water through the neighborhood. She reached out for help from the local municipalities, to no avail. Dougherty now believes that development in Charleston takes what she calls a “whack-a-mole approach,” where large developments are popping up at an alarming rate without adequate drainage solutions and are flooding surrounding properties.
So Dougherty got involved with a group called Fix Flooding First — another Higher Ground affiliate — because she wants to see a more comprehensive approach. “We need to have all the municipalities, the governing agencies, on the same page with building and zoning in a way that incorporates best practices,” she says. “We need to build in a way that preserves our natural environment, preserves our culture, and preserves our ability to have that tourism revenue. And I think we can do all that.”

While each community’s challenges are unique, common themes and challenges call out for action at the state or federal level — and even in the most vulnerable places, there is much that can be done to reduce the toll of flooding. For example, across the nation, developers continue to build in floodplains, finding workarounds to ordinances and federal regulations — and, according to Festing, they sometimes adopt dubious tactics to do so. Higher Ground members are alerting one another to these tactics and reporting them to the appropriate authorities, Festing says. In this way, they hope to spark change at a national scale.

There is no way to sugarcoat the challenges ahead. But as the waters rise, so do awareness and determination. Flood survivors are no longer simply victims; they are an ever-growing constituency for change. They are asking vitally important questions. They are challenging longstanding development practices and demanding a more equitable distribution of risks and rewards. They are grappling with the changing climate and its implications for the places we call home. And they are joining forces. “The big resonating thing that runs through my mind is unity,” says Dougherty. “If you can create a united voice, a united front, that is very powerful.”

This article was produced by the Island Press Urban Resilience Project, with support from the Kresge Foundation and the JPB Foundation. Laurie Mazur is editor of the project.
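The De Soto warning system described earlier reduces to a two-level threshold check on the creek gauge: an alert at 8 feet over normal, and an evacuation warning at 10. A minimal sketch of that logic; the function and level names are illustrative and not the actual app’s API:

```python
def creek_alert(feet_over_normal):
    """Classify a creek gauge reading against the two thresholds
    described for De Soto's Joachim Creek warning app (illustrative)."""
    if feet_over_normal >= 10:
        return "evacuate"   # "you better be out of there"
    if feet_over_normal >= 8:
        return "warning"    # "we're in trouble"
    return "ok"

print(creek_alert(7))    # ok
print(creek_alert(8.5))  # warning
print(creek_alert(11))   # evacuate
```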
Step flashing a roof to a wall with continuous exterior insulation requires a decision: Do you bring the step flashing back to the structural wall sheathing behind the continuous exterior insulation? Do you flash to the face of the continuous exterior insulation? Or do you install flashing at both layers? There’s more than one approach that can work, and a lot depends on where the water control layer — the water-resistive barrier (WRB) — is: behind the continuous exterior insulation or on its face. The sequence of the work plays a part, too. Is the continuous exterior insulation applied before the roofing is installed, or after? Then there’s the question, “What about re-roofing in the future?” Here are three approaches I’ve used.

Whichever route you choose to take, the first step is the same: protecting the roof-to-wall intersection with peel-and-stick flashing. Before anything else is done, a kickout flashing with 5-inch to 6-inch wall and roof legs is installed. This flashing goes over the ice barrier membrane (click here for how to install a kickout flashing). Then an 18-inch-wide strip of self-adhering, self-sealing flashing tape is applied 6 inches up the wall and 12 inches out onto the surface of the roof. A 2-inch strip of the release sheet is left along the roof leg of the flashing to lap together with the roof underlayment. The underlayment is bonded to the 2-inch strip of flashing and folded over, back towards the wall, to form a ‘J’ dam. This continuous flashing tape will collect any water that gets beyond the siding and step flashing and drain it down to the eave edge, where the kickout will divert it away from the wall. It’s considered best practice to use a drainable WRB behind…
Google’s short-term goals for its Motorola smartphone division are two-fold: clear the rubble on the runway and build for the future. When it comes to the runway, Google is still dealing with the product pipeline that it inherited from Motorola when the acquisition received final approval from regulatory bodies back in February 2012. Speaking at the Morgan Stanley Technology Conference this week, Google chief financial officer Patrick Pichette said Google still has 18 months of product pipeline that it has to “drain right now.” That product pipeline was on full display in September, when the latest Droid Razr devices were released. The reaction to the Droid Razr HD, Razr Maxx HD, and Razr M was a universal yawn, with all three devices seen as no more than iterative updates to the aging Razr series as Google continues to liquidate Motorola’s existing designs and assets. (See also: The New Motorola: Google’s Hardware Division Steps Into The Future.)

Even the next new smartphones to come out of Motorola are not anything to write home about. Pichette seems well aware of that fact, calling them “not really to the standards that what Google would say is ‘wow’ — innovative, transformative,” according to a report from The Verge. Essentially, Google has to clear away years of Motorola mediocrity, past and future, before it can really build a device that will stand out as the quintessential Android smartphone. That is where Guy Kawasaki comes in.

Kawasaki Joins Google To Advise Motorola

Guy Kawasaki – venture capitalist, publisher, and early Apple evangelist – loves Android. We learned this in December, when he sat down with ReadWrite editor-in-chief Dan Lyons for a ReadWrite Mix event in San Francisco. The revelation that Kawasaki, such a staunch Apple advocate for so long, was an Android fanboy was shocking to many people in the pro-iPhone camp.
(See also: Shock And Awe: Apple Legend Guy Kawasaki Has Become A Hardcore Android Fan.) Apparently, his devotion to Android runs deeper than he originally let on. Kawasaki tweeted on Wednesday that he will be joining Google as an advisor to Motorola. He will focus on “product design, user interface, marketing, and social media,” according to a post on his Facebook page. “Motorola reminds me of the Apple of 1998: a pioneer in its market segment, engineering-driven, and ripe for innovation. I believe that great products can change everything. For example, the creation of the iMac G3 (the Macs that came in colors such as Bondi, Strawberry, Blueberry, Lime, and Grape) was a pivotal event for Apple,” Kawasaki wrote.

What’s Next For Motorola?

Between Pichette and Kawasaki, we’re getting a pretty good idea of the direction of Motorola under Google’s stewardship. The first thing that needs to be done is to clear the pipeline. The next is to employ smart designers and idea people like Kawasaki to create transformative products under the Motorola name. Will that be the mysterious “Motorola X” that Google has supposedly been working on? If we date Motorola’s 18 months of product pipeline from the time the acquisition was approved by the U.S. Department of Justice, Google still has nearly six months or so of Motorola supply chain to suffer through. It would not come as a surprise to see at least one more iteration of the Razr series from MotoGoo before we find out what Motorola-under-Google really has up its sleeve. After all this time, it had better be something that makes us all say “Wow!”

Top image: Motorola Droid Razr M by Dan Rowinski
For the first time in a long time, Android 2.3 Gingerbread is no longer running on the majority of Android smartphones. According to Google’s dashboards, the two-year-old flavor of its Android mobile OS now runs on only 44.2% of devices. It’s been a long, slow death for Gingerbread, which was released in December 2010. At the time, it offered a great leap in usability over Android’s two previous builds — Éclair and Froyo — and was the first version of Android to really take off with users worldwide. But with the subsequent release of Ice Cream Sandwich (Dec. 2011) and two iterations of Jelly Bean (the first of which arrived in July 2012), most Gingerbread devices are either holdovers from people who have not yet upgraded to a new device or had their firmware updated by their carriers and manufacturers, or cheap devices that can be found in torrents in emerging markets.

Google names each version of Android alphabetically after tasty desserts. Android 2.2 was Frozen Yogurt, or Froyo; 2.3 was Gingerbread; 4.0 is Ice Cream Sandwich; and 4.1-4.2 is Jelly Bean. (We’ll just overlook the ill-fated 3.0 release dubbed Honeycomb.) Google analyzes the usage of each version by tracking how many devices access the Google Play app store in a given month.

Another Way Android Is Like Windows

When ReadWrite managing editor Fred Paul wrote the other day that Android is like the mobile version of the Windows PC operating system, he was right in more ways than one. Fred mostly meant that Android is becoming a malware haven the way Windows once was. But the two OSes have a similar relationship when it comes to distribution. For years, older versions of Windows have lingered in certain regions long after the rest of the world has moved on. That’s a somewhat bigger deal for Windows, since Microsoft puts out new versions roughly every three years — Android, by contrast, has had seven named versions since Sept. 2009.
According to StatCounter’s global breakdown, Windows XP (2001 release, roughly 24% global share) and Vista (2007 release, 6.6% share) are still running on roughly 30% of all Windows PCs. Even in the U.S., XP and Vista combined account for almost a full quarter of the entire Windows ecosystem. Why do older versions of Windows stick around so long? In part, because Microsoft licenses a variety of manufacturers to make laptops and PCs and sell them across the world. Certain regions (Korea, for instance) or sets of buyers — such as federal, state, and local governments — hang on to older versions longer to save money. Those same buyers also tend to favor cheaper PCs with older technology when it comes time to upgrade, and PC manufacturers cater to those customers.

With Android, The U.S. Is Ahead Of The Curve

The same phenomenon occurs with Android, just on an accelerated scale. If you look at Android distribution in the U.S. against the rest of the world, you’ll see that the rate of adoption for newer versions is much higher here than across the Android base as a whole. See the chart from Google on Android distribution from Feb. 2013 below. Gingerbread 2.3.x still makes up 44.2% of the entire ecosystem, Froyo 7.6%, and Éclair 1.9%. Overall, that means the majority of the global Android base is still running some version of 2.x, at 53.7%. Newer versions of Android have been gaining steam recently but are still in the minority: Android 4.0.x Ice Cream Sandwich reaches 28.6% of all Android devices, while the newest Jelly Bean builds are on 16.5%.

In the U.S., the trend is reversed. According to numbers obtained by mobile analytics company Localytics, Ice Cream Sandwich builds were on 33% of Android devices in the country as of the end of Feb. 2013. Jelly Bean in the U.S. also outperforms the global base of Android as a whole, at 21.62%. Together, the newest versions of Android make up 54.48% of Android devices in the U.S.
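The global share arithmetic above is just a sum over the 2.x versions. A quick sketch using the Feb. 2013 figures cited in the article (the dictionary layout is mine, not Google’s dashboard format):

```python
# Google's Feb. 2013 global Android distribution, per the article.
shares = {
    "Eclair (2.1)": 1.9,
    "Froyo (2.2)": 7.6,
    "Gingerbread (2.3)": 44.2,
    "Ice Cream Sandwich (4.0)": 28.6,
    "Jelly Bean (4.1-4.2)": 16.5,
}

# Devices still on some 2.x build -- the article's 53.7% figure.
legacy = sum(pct for name, pct in shares.items()
             if name.split("(")[1].startswith("2"))
print(f"Still on 2.x: {legacy:.1f}%")  # Still on 2.x: 53.7%
```

Note the listed versions total about 98.8%, not 100%; the remainder covers builds too small to chart, such as Honeycomb.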
Gingerbread still accounts for the largest single slice at 38% in the U.S., but its share is starting to evaporate as manufacturers upgrade certain older devices and release new, popular devices running the latest versions of Android, such as the Samsung Galaxy Note II, the Galaxy S3, the HTC One and the Motorola Razr HD series.

Weeding Out Gingerbread In The Rest Of The World

When it comes to mobile distribution, the U.S. is the leading indicator of new smartphone adoption, at least when it comes to Android and Apple’s iPhone. Gingerbread devices in the U.S. are increasingly hard to find at carriers like AT&T and Verizon, and are often very cheap or free when they do turn up. In the rest of the world, Gingerbread’s long tail is still very prevalent, with companies like Huawei and ZTE serving mid-to-low-end Gingerbread phones to emerging markets like China, Indonesia and India.

The next wave of Android devices to hit the U.S. will only extend this trend. The HTC One runs Android 4.1.2 Jelly Bean, while the upcoming Samsung Galaxy S4 is expected to run Jelly Bean 4.2.2. Huge marketing campaigns for both devices will probably push the share of the newest Android builds to 65% or higher by mid-year. But just as older Windows versions have proven hard to dislodge in the rest of the world, Android 2.3 will remain a large global presence for the foreseeable future.

Top photo: HTC One

dan rowinski
nick statt

When the iconic actor Samuel L. Jackson makes an appearance on social news site Reddit, he doesn’t mess around with some typical ask-me-anything (AMA) thread. No, the Big Kahuna Burger-stealing, snake-killing, shark bite-suffering Jedi Master takes it to a whole new level by kicking off a game-changing Reddit contest: the most upvoted 300-word comment on the thread will be read out loud in monologue form by Jackson himself. We can only assume the monologue will be recorded and shared, but either way it’s a hilarious and cunning twist on the typical celebrity Reddit AMA. Celebrities everywhere should take note that the best way to engage the Internet is not only to challenge it, but to do so in a way that adds value only you can provide. In this case, it’s Jackson’s unique ability to rant. (Read more: Bill Gates Does An AMA On Reddit – Promotes Robots, Speech & More)

Get. Your. Ass. Up.

At publication time, the leading comment (with an astounding 25,000 upvotes) comes from Reddit user teaguechrystie. Titled “my new alarm clock,” it’s a stirring speech designed to help get you out of bed. Try reading this line in your head in Samuel L. Jackson’s voice: “You think it’s supposed to come easy? You think I’ve had it easy? Get your ass up. Get. Your ass. Up. Right now.”

The rest of the rant is sort of like that, but the roughly 220 remaining words are studded with far too many f-bombs to reprint here. Hopefully, the actor will soon be turning this rant into an online video that could serve as inspiration for slug-a-beds everywhere. (Read more: I Am A President — Obamamania Shuts Down Reddit)

Charitable Motives

Jackson’s real Reddit motive is to raise awareness for the Alzheimer’s Association.
His Reddit post promotes chances to meet him for lunch in the U.K. with a $3 donation on his Prizeo page. A $200 donation gets an autographed Kangol hat, his trademark headwear of choice. The contest is slated to run through Thursday. We’ll add video of the winning monologue as soon as it’s available.

Update on 5/31/13 at 12:54 p.m. PT: Jackson has released the first video to YouTube after announcing that he would in fact be picking two monologues, one based on upvotes and the other of his own choosing. This first release appears to be Jackson’s personal pick, since many Reddit users point out that the submitted monologue was not in the running for the top upvote slot, which is still currently held by “my new alarm clock.” Here it is, folks:
readwrite

Google, eager to salvage its security-related reputation in the wake of disclosures about the NSA’s PRISM surveillance program, has asked a secretive intelligence court to let it disclose more details about government requests for information on its users, reports the Washington Post. In a legal filing Tuesday, Google cited a First Amendment right to speak about the information it must legally provide to the government. The company is seeking to have the Foreign Intelligence Surveillance Court lift a gag order that prevents companies from discussing or describing surveillance orders issued by that court, even in general terms. (See also: Tech Firms And Others Are Sharing — A Lot — With U.S. Spies And The Pentagon)
There has been much handwringing this past summer about how Apple’s new iOS 7 will affect people’s favorite apps. Apple redesigned iOS this year to feature more colors and a sleeker, flatter look, and apps that don’t conform to the new aesthetic will seem out of place in the latest version of iOS. Hence, there has been a rush among top developers over the past several months to be ready with hot, new versions of their apps when iOS 7 officially ships. That day has finally come: the public version of iOS 7 is available for download and installation tomorrow, September 18.

But what about people who can’t or won’t download iOS 7? Those people do exist, no matter how much Apple wants them to upgrade. Will their trusted apps stop working when the iOS 7-ready versions become available? People still use older iPhones, iPod Touches and iPads that aren’t on the latest iOS version, whether because they can’t be bothered to upgrade, because upgrading would break some work-related app they need for their job, or because their hardware simply can’t run it. For instance, the original iPad won’t be able to receive the iOS 7 update. Neither will any iPod Touch prior to the fifth (most recent) generation, nor any iPhone before the iPhone 4.

Apple’s Solution For The iOS 7-less

For users who do not have iOS 7, Apple has created the ability to download the most recent compatible older version of an app. One user on Reddit reports getting such an offer on his second-generation iPod Touch when downloading an app that would otherwise require iOS 5 or later. A test by ReadWrite with an iPod Touch running iOS 3.1 confirmed the result. This is new for Apple: in the past, if an app was not compatible with a user’s version of iOS, the app simply wouldn’t work.
Yet as the iPhone grows in age (the iPhone 5s and 5c are the sixth generation), Apple recognizes the increasing importance of supporting older devices with compatible apps. The company has built features into iOS 7 that let developers support both the newest version of the operating system and iOS 6, which retains Apple’s old design aesthetic and functionality. In iOS 7, Apple has also instituted auto-updates for apps, meaning apps will upgrade themselves in the background instead of requiring the user to manually download new versions. Older versions of iOS will not have this feature.

dan rowinski
YouTube is acquiring live video-game streaming service Twitch for $1 billion, entertainment magazine Variety reports, which would make it the largest acquisition for YouTube to date. A Google-owned video-game streaming service could be great for gamers, viewers and advertisers—at least so long as Google+ stays far, far away from it. (Both companies declined to comment.)

Non-gamers may not understand just how huge Twitch is. The service, which lets gamers watch, play, record and stream live video game play, has more peak traffic than both Facebook and Hulu. Just how much traffic is that? In 2013, viewers watched 12 billion minutes of gaming each month.

See also: Video Games As Spectator Sport—Why Twitch Is Booming

Twitch, a spinoff of online-video broadcaster Justin.tv, originally limited itself to streaming feeds from PC gaming, thanks to the closed ecosystems of consoles. But the service got a big boost when Sony and Microsoft built Twitch support into the PlayStation 4 and the Xbox One. Xbox One streaming now accounts for a significant portion of Twitch’s traffic: in the weeks following the launch of the Twitch Xbox One app, it accounted for 22% of unique broadcasts, which run an average of 28 minutes. That’s about as much time as streaming one sitcom on Hulu.

See also: Forget PS4 vs. Xbox One—The Console Wars Have Barely Begun

“Microsoft has put a lot of time and effort into ensuring their Twitch integration would be a robust experience, and based on the amount of Xbox One owners streaming from their living rooms, the move paid off,” Twitch COO Kevin Lin told ReadWrite in April. “This high rate of adoption for our console integrations has elevated our role in the entertainment industry. People go to Hulu to watch TV, Netflix to watch movies, and now they go to Twitch to watch and broadcast video games.”

Buying Twitch makes sense for YouTube.
The Google-owned video site has yet to capitalize on live video—most live streams happen elsewhere and appear on YouTube once they’re done. Assuming spectator gaming continues its meteoric rise, Google could benefit from millions of eyeballs—and millions of dollars aimed at watching other people play video games.

The Googlefication of Twitch

The buy makes sense for Google, but gamers might not be so happy. Typically, players use “gamertags,” usernames that don’t tie their accounts to any real identity or personal information. These days, Google is all about real identity. Last year Google required anyone who wanted to comment on a YouTube video to have a Google+ account, doing away with anonymous comments. The move was not well received by the YouTube community, and YouTube ended up creating a comments page to appease users and make it easier for video creators to moderate comments on their videos. Of course, you still need a Google+ account to comment in the first place.

If Google goes a similar route with Twitch and forces users to log in with Google credentials, the gaming community will likely be just as outraged as YouTube users were. That said, Twitch already offers Facebook Login, so some users may already be accustomed to sharing personal information with the service.

Making Money Off E-Sports Talent

Just like YouTube, Twitch has its own share of homegrown stars. Some of them are so good they’ve quit their jobs to play video games professionally.

See also: 5 Things You Didn’t Know About Pro Gaming

Twitch, like YouTube, has a partner program that lets broadcasters take a portion of the advertising revenue. This select group of personalities, leagues and tournaments has some control over commercial breaks, and is offered multiple options for engaging with fans and the community. The popularity of spectator gaming has even found its way offline.
Events attract thousands of people willing to pay to watch their favorite gamers pwn in real life, and the payouts can be worth millions. YouTube is hoping its own star-making talent translates into big advertising dollars for some of its top video creators, specifically lifestyle-oriented channels. With Twitch, YouTube can tap into an entirely new pool of talent, and appeal to advertisers with an audience that’s glued to watching their favorite gamers—and intermittent ads—for hours on end.

Lead image: a screenshot from Ducksauce on Twitch.tv

selena larson
A new urban sensing platform was developed for the project: the QC Urban QoL Sensor. This is a low-cost but reliable sensor array built around the Pro Trinket 5V board by Adafruit Industries. The devices measure air quality, noise, light levels, pedestrian counts, and temperature, pressure and humidity.

When the sensor data is combined with administrative, mobility, social media and Wi-Fi usage data, it can create a neighborhood profile to benchmark changes over time and compare with other areas of the city. These measures are designed to help communities identify and solve problems, focusing on issues of environmental health and mobility, through new sensing modalities, analytics and data visualization. The Red Hook Initiative (RHI), a local social-services community organization, was enlisted to install the sensors and engage the local community to provide additional, volunteered data.

Four sensors were installed in Red Hook at different heights. Local residents were also provided with portable sensors for a short period of time to measure temperature, noise and air quality at five-second intervals.
This provided an opportunity to identify problems through data science by those who know the area best. While IoT technologies are viewed as the next generation of urban infrastructure, low-income and economically distressed communities are often not the focus of urban and civic technology deployments, and the question of equity continues to be overshadowed by expedience in many technology-led city strategies.

Great opportunities for collaboration

The efforts of CUSP correlate well with other community projects such as Red Hook Wifi, which aims to bring free local Wi-Fi to residents, and Hack Red Hook, where participants created projects such as HighGround.nyc, a system that manages vehicle evacuation during emergency flooding. There’s also Open Sewer Atlas NYC, an open-source mapping and data project that aims to help organizations with limited access to technical information and data improve wastewater and stormwater management.

As to whether sensor data can be elicited and used to influence local community or public policy, CUSP’s results have so far been somewhat underwhelming: the biggest spike in air pollution was attributed to a local community BBQ. But researchers have been successful in creating a community project that involved local residents. Their data covers only the first of the three neighborhoods to be compared and contrasted. It’s a great opportunity to compare different neighborhoods and the impact of land use, and ultimately to involve citizens in smart science within their community.

The Center for Urban Science and Progress (CUSP) at New York University has recently released the findings from a smart city pilot study. It’s the first in-depth analysis of a local New York neighborhood intent on measuring the quantified community.
They utilized IoT sensors to collect and analyze quality-of-life measurements at high spatial and temporal resolution in the neighborhood of Red Hook, Brooklyn. Red Hook is an economically disadvantaged neighborhood: there is no subway service, only a few Internet hot spots, and close to 70 percent of the population lives in New York City housing projects. Residents experience an asthma rate more than 2.5 times the national average, and more than a third live below the federal poverty line. The life expectancy of residents in Red Hook is 10 years lower than the national average. Red Hook was significantly affected by flooding caused by Hurricane Sandy in 2012.

The measurement of quantified communities is a long-term neighborhood-informatics research initiative. Researchers will study three New York neighborhoods: Hudson Yards, Lower Manhattan and Red Hook. They aim to collect, measure and analyze data on the physical and environmental conditions and human behavior of each neighborhood, to better understand how neighborhoods and the built environment affect individual and social well-being.

Between physical environment and social health

Researchers explored the correlation between the physical, environmental and social components of the community. Baseline community measurements focused on environmental readings including air quality, temperature, pressure, humidity and luminosity. They collected air quality data, particularly PM2.5 concentrations (which have been shown to have a direct impact on cardiovascular and cardiopulmonary health), at the street level. Coupling this data with building energy-use data allows them to explore how the quality of the built environment, human behavior and environmental conditions interact to affect public health.

Researchers also explored the impact of the urban heat island (UHI) effect, in which man-made urban surfaces (such as concrete and asphalt) lead to increased heating of the surrounding area.
UHIs are linked to respiratory health conditions, high energy consumption and pollution. Localized temperature measurements enable researchers to ascertain the connection between temperature, the physical environment and resident health. The results can help local authorities identify buildings and communities vulnerable to heat waves and other heat-related emergencies.

Citizen science makes metrics inclusive

Cate Lawrence
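To make the kind of analysis described above concrete, here is a minimal sketch of rolling five-second PM2.5 samples, like those from the portable sensors residents carried, into hourly means and flagging exceedances. This is not CUSP's actual pipeline: the readings are synthetic, and the 35 µg/m³ cutoff (borrowed from the EPA's 24-hour PM2.5 standard) is used only as an example threshold.

```python
from collections import defaultdict
from statistics import mean

# Illustrative only: synthetic (timestamp_seconds, pm25_ug_m3) readings
# sampled every five seconds over two hours, like the portable sensors above.
readings = [(t, 20.0 + t / 600.0) for t in range(0, 7200, 5)]

# Bucket the five-second samples into hourly windows and average them.
hourly = defaultdict(list)
for ts, pm25 in readings:
    hourly[ts // 3600].append(pm25)
hourly_means = {hour: mean(vals) for hour, vals in hourly.items()}

# Flag hours whose mean exceeds the example cutoff of 35 ug/m3.
THRESHOLD = 35.0
flagged = [hour for hour, m in hourly_means.items() if m > THRESHOLD]
for hour in sorted(hourly_means):
    note = "  << exceeds threshold" if hour in flagged else ""
    print(f"hour {hour}: mean PM2.5 = {hourly_means[hour]:.1f} ug/m3{note}")
```

In a real deployment the interesting part is exactly what CUSP found: joining these hourly aggregates with land-use, weather and event data to explain spikes (like the community BBQ) rather than just detecting them.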
David Curry

A package of four bills allowing fully autonomous testing in Michigan was approved by the House of Representatives and the Senate on Thursday. The bills make it legal for fully autonomous cars to drive without a driver inside and open up 122 miles of public road for testing. The legislature also approved a plan to redevelop Willow Run airport into a test site for self-driving vehicles.

See Also: Toyota steers millions into University of Michigan AI project

Someone must monitor the autonomous car, but they don’t have to be inside it. This gives Uber, Lyft and other ride-hailing companies the opportunity to cut out the driver and keep a few technicians on hand to watch for failures in the system. Michigan has been relaxing self-driving laws far more quickly than other states, with the intention of bringing Silicon Valley dollars and jobs to the Rust Belt.

Many self-driving automakers already in Michigan

Toyota, General Motors, Ford, Nissan and Honda are already testing autonomous vehicles or investing in autonomous projects in Michigan. Toyota also co-owns, with the other four automakers, the American Center for Mobility, the organization in charge of redeveloping the Willow Run airport into an autonomous test site.

Technology companies have been less receptive to Michigan: Uber has set up shop in Pittsburgh, Pennsylvania, and both Google and Apple are sticking to San Francisco. Bringing automotive jobs back to Detroit has been an aim of Gov. Rick Snyder, who on Thursday thanked everyone who worked on the bills for ensuring that “Michigan is the new mobility leader of the world.” Michigan is not the only state pushing for relaxed autonomous-vehicle laws: Florida, Nevada, Arizona and California have all passed laws aimed at making it easier for companies to test self-driving vehicles.