Forrester analysts will host a “Tweet Jam” on February 10, 2010, from 1:00 – 3:00 PM ET to answer questions from Business Process professionals and App Dev professionals about top challenges facing their process improvement initiatives. During this interactive Jam session, Forrester analysts will share the results of our groundbreaking “Business Process Professional Role Deep Dive” research that uncovered major trends and critical challenges facing aspiring process improvement programs.
Key questions we will tackle during this Tweet Jam include:
1. Which role(s) should lead your business process initiative?
2. What are the best practices for establishing your BPM COE?
3. Do your traditional business analysts have what it takes to drive BPM initiatives?
4. How heavily should you rely on your software vendor for project implementation?
5. How should you connect your EA and BPM initiatives?
6. Which process improvement methodology (Six Sigma, Lean, TQM) is best for your initiative?
7. How should you incorporate BPMN modeling into your process initiative?
8. How should you measure the progress or success of your process initiative?
9. What’s the typical size and composition of process improvement teams?
10. How should process improvement connect to master data management?
11. How do you think Social BPM will impact your organization?
The session will be hosted by Clay Richardson, Connie Moore, Craig Le Clair, Alex Peters, John Rymer, and Ken Vollmer. To join this interactive conversation, simply tune in to the #bpmjam hashtag on Twitter or follow the analysts who will host and moderate the session.
In my ongoing work with clients, I try as often as possible to stress the importance of flexibility in GRC programs. Internal processes and technology implementations must be able to accommodate the perpetually fluctuating aspects of business, compliance requirements, and risk factors. If GRC investments are made without consideration for likely requirements 1 to 2 years down the road, decision makers aren’t doing their job. And if vendors don’t offer that flexibility, they shouldn’t be on the shortlist.
News outlets over the past year have given us almost daily examples of change in the GRC landscape. The recent stories coming out of Davos have been no exception... giving us some truly fascinating debates on the necessity and detriment of regulations. As quoted in a Wall Street Journal article on Sunday, Deutsche Bank AG Chief Executive Josef Ackermann argued against heavy-handed regulation, saying, "We should stop the blame game and we should start looking forward... if you don't have a strong financial sector to support this recovery... you're making a huge mistake and you will regret that later on," he said. French President Nicolas Sarkozy summed up the opposing argument in his keynote, explaining, "There is indecent behavior that will no longer be tolerated by public opinion in any country of the world... That those who create jobs and wealth may earn a lot of money is not shocking. But that those who contribute to destroying jobs and wealth also earn a lot of money is morally indefensible."
Gene briefly explores the misunderstanding between “Enterprise IA” and “User Experience IA.” This tension was well characterized by Peter Morville almost 10 years ago (see “Big Architect, Little Architect”). Personally, I think it’s clear that content is always in motion, and unsupported efforts to dominate and control it are doomed. People are a critical element of a successful IA project, since those who create and use information are in the best position to judge and improve its quality. Many hands make light work, as the saying goes.
For example, if you want a rich interactive search results page, you need to add some structure to your content. This can happen anytime from before the content is created (using pre-defined templates) to when it is presented to a user on the search results page. Content is different than data, a theme Rob Karel and I explored in our research on Data and Content Classification. For this reason, IA is both a “Back end” and a “Front end” initiative.
The first reports on the IT market in Q4 2009 are now in, and they are in line with our prediction that the tech market recession ended in that quarter (see US And Global IT Market Outlook: Q4 2009). Overall, the tech market in Q4 2009 was more or less flat with the same quarter the year before – an improvement from the prior quarter, when growth was negative, and evidence that the 2010 tech market will post positive growth.
The US economy was stronger than expected, but 5.7% real GDP growth is an aberration. The US Department of Commerce released preliminary data on Q4 2009 economic growth, and the result was surprisingly strong: 5.7% growth in real GDP and 6.4% in nominal GDP from the previous quarter (on a seasonally adjusted annualized basis). However, about two percentage points of that growth was due to inventory re-stocking, which will not be repeated in future quarters. And based on prior GDP reports, this growth rate will probably be revised down as new data comes in. (In Q3 2009, the growth rate in real GDP started at 3.5% but ended up revised down to 2.2%.) Still, this report confirms that the US recession is over, and slower but steady growth is likely for the rest of 2010.
Security researchers in the UK say that the 3-D Secure (3DS) system for credit card authorization, a protocol that was "developed by Visa to improve the security of Internet payments," has significant security weaknesses. It is used by both of the ginormous card brands, branded as "Verified by Visa" and "MasterCard SecureCode."
This could be a big deal.
In a recent paper, the researchers call out 3-D Secure as a security failure that was pushed on consumers by financially incentivized merchants because, "its use is encouraged by contractual terms on liability: merchants who adopt 3DS have reduced liability for disputed transactions. Previous single sign-on schemes lacked liability agreements, which hampered their take-up."
According to the authors:
"3-D Secure has lousy technology, but got the economics right (at least for banks and merchants); it now boasts hundreds of millions of accounts. We suggest a path towards more robust authentication that is technologically sound and where the economics would work for banks, merchants, and customers - given a gentle regulatory nudge."
We became curious after we interviewed Linda Cureton of NASA a few months ago, when we were a bit surprised to discover that she has an active blog (her Thanksgiving entry implores CIOs to give thanks to their “geeks”). And there’s Rob Carey, CIO of the Navy, who has been blogging for the past two years. So we decided to look around to see which other CIOs are actively blogging. Active implies recent, which takes quite a bit of time and thought and is probably not for everyone. So who else besides Linda takes the time and thought? Here are a few who do, though not always frequently.
The market for enterprise carbon and energy management (ECEM) systems continues its rapid evolution. Since publishing our Market Overview report last November, we have interviewed at least a half-dozen additional systems providers coming into this nascent market.
Last week we talked with Dan DeKemper, a director at PricewaterhouseCoopers who works with the firm's 800-person-strong sustainability practice on large-scale ECEM implementation projects. Dan told us that PwC sees three industry sectors driving ECEM adoption:
Utilities and Energy, the traditional "heavy emitter" industries that are focused on monitoring and reducing carbon emissions for regulatory compliance and public perception reasons.
Retail and CPG, two verticals where adoption is now growing faster than Energy. These companies are implementing ECEM on a voluntary basis, looking to improve brand equity and align with sustainability initiatives of some of their customers like Walmart.
Public sector organizations, looking to be role models for the private sector and also under executive or legislative mandate to improve energy efficiency.
Several clients have recently been asking about "Virtual Network Segmentation" products that claim to segment networks to reduce PCI compliance scope. They may use ARP or VLANs to control access to various network segments. These types of controls work at Layer 2, and the hacker community is well versed at using tools such as Ettercap or Cain & Abel to bypass them. We've recently written about Network Segmentation for PCI as part of the PCI X-Ray series.
While rereading the PCI Wireless Guidance document, I came across this nugget that puts a nail in the coffin of using VLANs as a security control: "Relying on Virtual LAN (VLAN) based segmentation alone is not sufficient. For example, having the CDE on one VLAN and the WLAN on a separate VLAN does not adequately segment the WLAN and take it out of PCI DSS scope. VLANs were designed for managing large LANs efficiently. As such, a hacker can hop across VLANs using several known techniques if adequate access controls between VLANs are not in place. As a general rule, any protocol and traffic that is not necessary in the CDE, i.e., not used or needed for credit card transactions, should be blocked. This will result in reduced risk of attack and will create a CDE that has less traffic and is thus easier to monitor."
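To make the guidance concrete, here is a minimal sketch of what "block any protocol not necessary in the CDE" could look like as a Layer 3 firewall policy sitting between VLANs, rather than relying on VLAN tags alone. The interface names, VLAN IDs, and host address below are entirely hypothetical, and a real deployment would be built from your own traffic requirements:

```shell
# Hypothetical Linux gateway routing between the WLAN segment (eth0.20)
# and the CDE segment (eth0.10). All names and addresses are illustrative.

# Default stance: deny all forwarded traffic between segments.
iptables -P FORWARD DROP

# Permit only traffic the CDE actually needs -- here, TLS traffic to the
# payment application from a single approved host on the WLAN side.
iptables -A FORWARD -i eth0.20 -o eth0.10 -s 10.0.20.5 -p tcp --dport 443 \
         -m state --state NEW,ESTABLISHED -j ACCEPT

# Allow return traffic for those established sessions only.
iptables -A FORWARD -i eth0.10 -o eth0.20 -m state --state ESTABLISHED -j ACCEPT

# Log and drop everything else, so the smaller CDE is easier to monitor.
iptables -A FORWARD -j LOG --log-prefix "CDE-DENY: "
iptables -A FORWARD -j DROP
```

The point is that the access control decision happens above Layer 2, where ARP spoofing and VLAN-hopping tricks don't apply, and the default-deny posture leaves less traffic in the CDE to watch.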
Finally, Apple’s latest game-changing, must-have device has arrived: the iPad. The iPad is not a new idea. Tablet PCs were introduced years ago but failed to take off. More recently, the Amazon Kindle proved that a simpler form of the tablet has legs. But what Apple does brilliantly is do it better.