Want to know more about Access Certification and Attestation? Would you like to win an iPad and get a courtesy copy of a Forrester report on the findings of a survey on the topic?
Forrester is collaborating with the University of British Columbia (UBC) on an Identity and Access Management survey. The main topic of the survey is Access Certification and Attestation, also known as Access Governance. It takes only 15 minutes to complete the survey. In August 2013, Forrester, in collaboration with UBC, will publish highlights of the survey results.
Conversations with vendors and IT end users in Forrester's security practice lead us to predict that XACML (the lingua franca for centralized entitlement management and authorization policy evaluation and enforcement) is largely dead or will be transformed into access control (see Quest APS, a legacy entitlement management platform based on BiTKOO, which Dell will probably morph into a web SSO platform).
Here are the reasons why we predict XACML is dead:
Lack of broad adoption. The standard is still not widely adopted by large enterprises, which have already written their own authorization engines.
Inability to serve the federated, extended enterprise. XACML was designed to meet the authorization needs of the monolithic enterprise, where all users are managed centrally in Active Directory (AD). This is clearly not the case today: companies increasingly have to deal with users whose identities they do not manage.
The policy decision point (PDP) does a lot of complex things that it does not inform the policy enforcement point (PEP) about. If you get a 'no, you can't do that' decision in the application from the PEP, you'd want to know why. Our customers tell us that this can prove very difficult: the PEP may not be able to find out from the complex PDP evaluation process why an authorization was denied.
Not suitable for cloud and distributed deployment. While some PEPs can bundle the PDP for faster performance, using a PEP in a cloud environment where you only have a WAN link between the PDP and the PEP is not an option (a minimal sketch of that round trip follows this list).
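To make the last two points concrete, here is a minimal sketch (not production code) of a PEP asking a remote PDP for a decision using the JSON Profile of XACML 3.0 over REST. The PDP endpoint URL, the timeout, and the helper name are assumptions for illustration; real deployments would use whatever interface their entitlement product exposes.

```python
# Hypothetical PEP-side check against a remote XACML PDP (JSON profile over REST).
import requests

PDP_URL = "https://pdp.example.com/authorize"  # assumed PDP endpoint, not a real product URL


def is_permitted(subject_id: str, action: str, resource: str) -> bool:
    """Ask the remote PDP for a decision; treat anything but an explicit Permit as a denial."""
    xacml_request = {
        "Request": {
            "AccessSubject": {"Attribute": [{
                "AttributeId": "urn:oasis:names:tc:xacml:1.0:subject:subject-id",
                "Value": subject_id}]},
            "Action": {"Attribute": [{
                "AttributeId": "urn:oasis:names:tc:xacml:1.0:action:action-id",
                "Value": action}]},
            "Resource": {"Attribute": [{
                "AttributeId": "urn:oasis:names:tc:xacml:1.0:resource:resource-id",
                "Value": resource}]},
        }
    }
    # Every check is a synchronous round trip; over a WAN link between PEP and PDP
    # this latency lands directly in the application's critical path.
    reply = requests.post(PDP_URL, json=xacml_request, timeout=2).json()
    decision = reply["Response"][0]["Decision"]
    # The PEP only sees Permit / Deny / NotApplicable / Indeterminate; the policy
    # evaluation that produced a Deny stays hidden inside the PDP.
    return decision == "Permit"
```

Note how the application learns only the final decision string, which is exactly why a bare 'no, you can't do that' is so hard to troubleshoot from the PEP side.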
A common theme during this week's SAS and FICO user conferences was how to use Big Data to make fraud decisions faster, more accurately, and without negatively impacting customers.
Big Data is basically about the three Vs: using the Volume, Velocity, and Variety of data to gain veracity and value in fraud management. Volume and Velocity are nothing new: fraud management products have long been capable of analyzing terabytes of data across billions of transactions - in real time.
What's really new for Fraud Management about Big Data is Variety: using all types of new information to make better decisions with lower false positive rates. The new data sources increasingly used in Fraud Management are:
Social network data. Has this user been writing about committing fraud on Facebook? After seeing how dumb some criminals can be, this data source is pretty important.
Geolocation of mobile devices. The fraud management system should warn ahead of time if the user's mobile device was not in the same location as the ATM when his/her ATM card was used to empty the bank account.
Identity and Access Management (IAM) system logs. The fraud management system should warn ahead of time if the authentication system in front of the customer-facing application sees any evidence of the user logging in from a risky geography or from a new device before the account is emptied online through unauthorized transfers to a mule account (a combined sketch of these signals follows this list).
Textual and unstructured data. The fraud management system should warn ahead of time if, for example, a medical provider or insurance adjuster keeps using the same combination of terms, such as "suture removal" or "rear hit accident", in suspicious contexts or in an excessively repetitive way.
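As a rough illustration of how these data sources could feed a single fraud decision, here is a minimal, hypothetical sketch that combines the signals above into one additive risk score. The signal names, weights, and review threshold are illustrative assumptions, not any vendor's actual scoring model.

```python
# Hypothetical combination of Variety-driven signals into one fraud risk score.
from dataclasses import dataclass


@dataclass
class TransactionSignals:
    device_near_atm: bool        # mobile-device geolocation matches the ATM location
    new_device_login: bool       # IAM logs: login from a previously unseen device
    risky_geo_login: bool        # IAM logs: login from a high-risk geography
    repeated_claim_terms: int    # count of repeated suspicious terms in claim text


def fraud_risk_score(s: TransactionSignals) -> float:
    """Additive risk score in [0, 1]; higher means more likely fraud. Weights are assumed."""
    score = 0.0
    if not s.device_near_atm:
        score += 0.4             # card used far from where the phone actually is
    if s.new_device_login:
        score += 0.2
    if s.risky_geo_login:
        score += 0.2
    score += min(s.repeated_claim_terms, 5) * 0.04  # e.g. "suture removal" overuse
    return min(score, 1.0)


# Example: card used at an ATM nowhere near the phone, right after a login from a new device.
signals = TransactionSignals(device_near_atm=False, new_device_login=True,
                             risky_geo_login=False, repeated_claim_terms=0)
if fraud_risk_score(signals) >= 0.5:   # assumed review threshold
    print("flag transaction for review")
```

The point of the sketch is simply that the new, varied data sources add signals a transaction-only model never sees, which is where the lower false positive rates come from.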