ECM Talk
About this podcast
A series of conversations between James Lappin and Alan Pelz-Sharpe. James is a records management consultant and trainer, Alan is an ECM industry analyst.
Episodes (Total: 18)
July 5, 2013 · 01:06:29
In this podcast Ian Meldon, a records management consultant working for the United Nations Food and Agriculture Organisation (FAO), describes FAO's e-mail-based records management system.

The previous records management system at FAO

The previous electronic records management system that FAO operated was also based on e-mail. From the year 2000 they asked colleagues to copy or forward any e-mail needed as a record to the e-mail address of their local registry, where registry staff would file the e-mail in a Microsoft Outlook shared folder structure. The system worked tolerably well, although compliance with the policy varied from area to area. One weakness of the previous system was that all the records were kept within the Microsoft Exchange environment. People could only see the records of their local area - there was no possibility of an FAO-wide search. There was no sustainable way of holding and applying retention rules to the records.

Principles behind FAO's new records management system

When FAO decided to overhaul the records system they based their approach on three principles:

Don't appear to introduce yet another computer system

FAO have procured and implemented a robust electronic records management system to use as their repository (FileNet from IBM). But end-users never need to interact directly with the FileNet repository - everything they need to do on the system can be done through the Outlook e-mail client.

Don't ask people to do something they are not already doing

The idea was not to ask users to do anything more time-consuming than the previous system's demand that they copy the registry in on significant e-mails. Under the new system, every time a colleague sends an e-mail a pop-up appears asking them to say whether the e-mail is a) personal or trivial, b) draft/transitory, or c) an FAO record. If an individual selects personal/trivial then the e-mail is sent without going into the records repository.
If the individual selects either draft/transitory or FAO record then they are asked to choose the appropriate 'team tag' for the message (the team tag denotes which team they were working for in sending the message). The message then gets sent and a copy is placed in the records repository. There is also the opportunity to mark a message as confidential if it is work-related but access to it needs to be restricted.

Provide something useful beyond the need to keep records

At 10pm every night the system generates a 'digest' for each team tag. The digest is an e-mail that lists and links to all the draft/transitory and record e-mails sent that day and tagged with that team tag. This means that each morning an individual can see at a glance all the significant e-mails sent by colleagues in their team the previous day. This has reduced the need for colleagues to 'copy each other in' to e-mails. Furthermore, individuals can choose to receive digests from other teams (if they have the appropriate permissions). If a manager oversees six or seven teams they can look at the digests for those six or seven team tags each morning, without needing to be copied into hundreds of e-mails.

In the podcast Ian describes the evolution of records management at FAO over the past twenty years. Ian also discusses:
- the conclusions FAO drew from looking at other organisations and their electronic records management systems
- FAO's functional records classification and how they linked team tags to it
- the role of records management/registry staff in supporting the system
- experiments with auto-classification
- the uses colleagues make of the system and of the digests
- how FAO handles incoming e-mail
- how FAO have responded to the challenge of mobile devices
- the search facility that FAO have built into Outlook that enables individuals to search the repository for e-mails that colleagues across the organisation have saved as draft/transitory or FAO record

Ian is @MrMeldon on Twitter.
His LinkedIn page can be found at . Ian was interviewed by James Lappin, who is @jjameslappin on Twitter and who blogs at
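The nightly digest mechanism Ian describes can be sketched in a few lines of Python. This is an illustrative toy, not FAO's actual implementation: the classification labels mirror the pop-up choices described above, and all names and message data are invented.

```python
from collections import defaultdict

def build_digests(messages):
    """Group the day's draft/transitory and record e-mails by team tag,
    skipping anything classified personal/trivial (which never reaches
    the repository). Returns one digest body per team tag."""
    digests = defaultdict(list)
    for team_tag, classification, sender, subject in messages:
        if classification in ("draft/transitory", "record"):
            digests[team_tag].append(f"[{classification}] {sender}: {subject}")
    return {tag: "\n".join(lines) for tag, lines in digests.items()}

# Hypothetical day's traffic: (team_tag, classification, sender, subject)
messages = [
    ("fisheries", "record", "a.smith", "Quota revision"),
    ("fisheries", "personal/trivial", "a.smith", "Lunch?"),
    ("forestry", "draft/transitory", "b.jones", "Draft survey plan"),
]
digests = build_digests(messages)
```

A manager subscribed to several team tags would simply receive one such digest per tag, rather than being copied into every message.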
March 12, 2013 · 00:39:23
Adrian Brown and James Lappin talk about digital preservation and its relevance to the information challenges faced by organisations. Adrian Brown is head of digital preservation and access at the Parliamentary Archives in the UK. His book 'Practical Digital Preservation - a how-to guide for organisations of any size' will be published in the UK and US this May. Adrian talks of a 'blurring of the boundary' between digital objects (.doc, .jpeg, .xls etc.) and the applications that they are held in. Key information about these digital objects is held by the applications. Digital preservation has had some success in tackling the problem of how to preserve the file formats of the objects themselves. Now it faces a more complex problem: how do we preserve the information that an application has about the objects it holds? How do we enable digital objects to move from one application to another without losing that information? Adrian describes different models for digital repositories, and different ways of tackling the preservation issues arising from a complex application such as a Building Management System.
Feb. 15, 2013 · 00:30:33
In this episode Alan Pelz-Sharpe explains why he thinks advanced analytics tools will come into mainstream business document management and collaboration systems over the next few years. He discusses the different things that large organisations use advanced analytics for (automatic classification; checking communications for suspicious activities and/or for compliance breaches). Alan and James debate the question of whether or not advanced analytics could have records management uses. James Lappin asks Alan whether it would be feasible to:
- equip an information governance tool with records management rules, such as retention rules and a records classification
- get the information governance tool to apply those records management rules to content held in the many different applications used within an organisation.
Alan said that it was perfectly possible, but the key limiting factor was the scope of each application's Application Programming Interface (API). If the application's API includes provision for an external tool to perform records management tasks (for example the application of records retention rules) then you could manage records in that application from outside. But if the application's API includes no provision for records management tasks, then your information governance tool will not be able to manage records held within it. The most important standard for APIs is Content Management Interoperability Services (CMIS). At the time of writing, CMIS supports basic document management tasks but not records management tasks. This episode was recorded in London on 15 February 2013. Alan Pelz-Sharpe is Research Director for content management and collaboration at 451 Research. James Lappin blogs at
July 13, 2012 · 00:39:56
Alan Pelz-Sharpe, Sharon Richardson and James Lappin discuss Office 365, Microsoft's cloud offering. Office 365 is a 'bundle' of cloud services, including e-mail, the cloud version of SharePoint, and Lync instant messaging. We discuss the challenges that Microsoft face in providing both cloud and on-premise versions of SharePoint. Sharon said Microsoft will need to improve the way it moves cloud customers from one version of SharePoint to the next. The cloud version of SharePoint 2007 had been part of BPOS (the predecessor to Office 365). The cloud version of SharePoint 2010 was not released until nearly a year after the on-premise version. There was no simple upgrade path - instead cloud customers had to migrate over from BPOS to Office 365. The plus side was that the online version of SharePoint 2010 contains almost the whole range of SharePoint functionality (it doesn't have the SharePoint records centre, though). It includes, for example, most of the service applications (such as the user profile service and the search service). The main competitor to Office 365 was Google Apps, which does not have an on-premise version and is nowhere near as powerful or as configurable as SharePoint Online. Its features have developed very little over the past few years, and such changes as have been made have been deployed by Google without disruption to customers. SharePoint Online also faces competition from two very different sources:
- the relatively simple file-sharing services such as Huddle, Box and Dropbox, which like Google Apps do not have on-premise versions
- the powerful, complex and configurable document management products from ECM vendors such as Open Text, IBM, Oracle and Documentum, which have on-premise versions as well as cloud and hybrid versions.
Alan said that the document management products from the ECM vendors are overwhelmingly deployed on-premise.
If an organisation is deploying a serious document management system, with a degree of customisation and of integration with other applications, then it is almost certainly going to want that application to be on-premise rather than in the cloud. Alan Pelz-Sharpe is Research Director of the 451 Group.  Sharon Richardson is an independent consultant and founder of Joining Dots Limited.  James Lappin is an independent records management consultant and founder of Thinking Records Ltd.
March 14, 2012 · 00:58:28
Between 2004 and 2006 Ben Plouviez (@benplouviez) oversaw the roll-out of an EDRM (electronic documents and records management) system across what was then the Scottish Executive (but is now the Scottish Government). Six years later the system contains 14 million documents and is used by around 4,000 staff. In this podcast Ben reflects even-handedly on both the benefits that having an organisation-wide records repository has brought to the Scottish Government, and on the promises that the system has not fulfilled. The roll-out of the EDRM was driven partly by the Scottish Executive's desire to break down silos between the various different parts of the administration. They made the decision that wherever possible files would be open and accessible to the whole of the Scottish Government. There have been times when colleagues have found documentation that they would never have known existed were it not for the EDRMS. The EDRM's Scottish Government-wide business classification scheme has not been an unqualified success, but nor could it be called a failure. It is not terribly popular with users, who rarely use it to navigate to material that they wish to find. On the plus side, however, the scheme has provided a stable and enduring structure for the system. Ben has found that the electronic files on the EDRM system do not tell a narrative in anything like as clear or as usable a way as a typical paper file used to do. Ben questioned whether it was feasible for records managers to expect their organisations to keep a full electronic file of every piece of work they carry out. Ben said that the concept of the file is predicated on the concept of the document, and we are now seeing alternatives to the document in the form of blogs, wikis, discussion forums, etc. None of these new formats fits naturally into the file. James Lappin found it significant that the MoReq2010 specification used the word 'aggregation' instead of the word 'file'.
This implies that in the electronic world there are many different ways in which business communications can be collected (e-mails in in-boxes, tweets in tweet streams, etc.). There have been some unexpected benefits to having an organisation-wide records repository. For example, the Scottish Government have taken information from the system's audit logs about who has read what on the EDRM and translated it into RDF triples (the non-proprietary format that underpins linked data and the semantic web). They have then provided an interface to enable colleagues to query this data to find out what their colleagues have read on the system. This enables the serendipitous finding of documents of current interest, and provides a more human way of browsing and interrogating the system than that provided by either the business classification or the search facility. The Scottish Government have also used the same technique in relation to e-mail logs. They have taken the records of who sent an e-mail to whom and when, converted them to RDF, and provided a query and visualisation interface. This means colleagues can find out who has been corresponding with particular colleagues or stakeholders. Note that the content of the e-mail is not visible, only the fact that an e-mail has been sent, and only e-mails with at least one person in cc have been included, to ensure that private correspondence between two people is excluded. Ben talked about the plans for the future of electronic records management in the Scottish Government, including their intention to replace their existing EDRM within the next three or four years. He speculated on whether it would be possible for one product/system to fulfil both their collaboration and records management needs, or whether the Scottish Government would have to implement several different tools to deliver that vision.
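The audit-log-to-RDF technique can be sketched in plain Python, emitting N-Triples (the simple line-based serialisation of RDF). This is a hypothetical illustration, not the Scottish Government's code: the base URI, the predicate names and the log rows are all invented. The e-mail function applies the privacy rule from the podcast - only e-mails with at least one cc recipient are included.

```python
def audit_rows_to_ntriples(rows, base="http://example.gov.scot"):
    """Turn 'who read what' audit-log rows into N-Triples lines."""
    triples = []
    for user, doc_id in rows:
        triples.append(
            f"<{base}/person/{user}> "
            f"<{base}/ontology/hasRead> "
            f"<{base}/document/{doc_id}> ."
        )
    return triples

def email_rows_to_ntriples(rows, base="http://example.gov.scot"):
    """Turn e-mail-log rows into N-Triples, skipping e-mails with no cc
    so that private one-to-one correspondence is excluded."""
    triples = []
    for sender, recipient, cc_list in rows:
        if not cc_list:
            continue  # privacy rule: exclude person-to-person mail
        triples.append(
            f"<{base}/person/{sender}> "
            f"<{base}/ontology/sentMailTo> "
            f"<{base}/person/{recipient}> ."
        )
    return triples

read_triples = audit_rows_to_ntriples([("jsmith", "doc42")])
mail_triples = email_rows_to_ntriples([
    ("a.brown", "b.scott", []),          # excluded: no cc
    ("a.brown", "b.scott", ["c.reid"]),  # included
])
```

Once in N-Triples form, the data can be loaded into any triple store and queried with SPARQL, which is what makes the browsing and visualisation interfaces described above possible.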
March 1, 2012 · 00:53:28
In this podcast James Lappin asks Matt Mullen to explain what Big Data is. The podcast was prompted by a blogpost Matt had written: 'Big Data plus enterprise search = Big enterprise disappointment?' Matt contrasts the vendor-driven, enterprise-centric vision of Big Data (vendors selling tools to help organisations make use of the content they have accumulated over the years in different repositories) with the more transparent, idealistic and web-centric vision of linked data (organisations marking up their structured data with RDF and making it available for others to run queries on, or to use for data mash-ups). Matt explains why it is easier for Google to make sense of the world wide web than it is for an enterprise search engine to make sense of documents and data from multiple different repositories within an organisation. James and Matt discuss whether or not the distinction between structured and unstructured data is a meaningful one. The podcast was recorded at the Royal Festival Hall, London on 27 February 2012. Matt Mullen is an analyst for the Real Story Group, specialising in Search and in Web Content Management. He is on Twitter as @MattMullenUK.
Jan. 26, 2012 · 00:48:19
Richard Harbridge and James Lappin discuss information architecture issues within SharePoint. Richard gives his rule of thumb for answering the following question: when a new area or function comes on board in a SharePoint implementation, is it best to set up a SharePoint site collection, or simply a site within an existing site collection? We discuss the pros and cons of 'site collections', which are a feature unique to SharePoint. Site collections are a hierarchical collection of SharePoint sites sharing common administrative settings and some common information architecture features such as content types. Crucially, a site collection cannot be split across separate SQL Server content databases, so there are storage as well as information architecture considerations in deciding how many site collections to set up and what for. Microsoft recommends that each site collection does not exceed 100GB in size. James asks about the relationship between site collections and search, and Richard describes some tips for configuring a SharePoint search centre with search 'scopes' set up to enable your users to target their searches at particular site collections or at particular types of content. We discuss the strengths and weaknesses of refiners in SharePoint search. Refiners are a set of links returned alongside SharePoint 2010 search results which enable users to filter those results by defined parameters (for instance date modified, document type, project title). James is disappointed firstly that the SharePoint 2010 refiners only filter the first 500 results, but more importantly that they give no indication to the user that only the first 500 results have been refined. The discussion then touches on the managed metadata service in SharePoint 2010 as a way of getting controlled vocabularies out of the confines of a single site collection and into a place where they can be used by any site collection.
Richard outlined some of the ways in which the managed metadata service does not work as well as he would like (and mentions an article by Michal Pisarek in which these weaknesses are collected) but says he still recommends his clients make some use of it. We finish by talking about 'business connectivity services' in SharePoint. This enables data (in the form of database rows and columns) to be imported into SharePoint from another database within the organisation. Once the data is in SharePoint it can be used as a controlled vocabulary to improve the findability of content. Richard gives the example of a law firm importing into SharePoint a list of its matter numbers from its customer database. The one disappointment is that the business connectivity service does not work with the managed metadata service - it is not possible to import a list (for example a list of clients) into the managed metadata service from a line-of-business database and use it as a controlled vocabulary within SharePoint.
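The refiner limitation James complains about is easy to demonstrate with a toy model (this is illustrative Python, not SharePoint code): refiner counts computed over only the first 500 results can badly understate the true totals whenever a category is concentrated lower down the ranking.

```python
def refiner_counts(results, field, limit=None):
    """Count values of `field`, optionally over only the first
    `limit` results - mimicking a refiner that inspects a truncated
    result set rather than the whole one."""
    counts = {}
    for item in (results[:limit] if limit else results):
        value = item[field]
        counts[value] = counts.get(value, 0) + 1
    return counts

# 1,200 invented hits: .docx ranked early, .pdf mostly beyond position 500
results = [{"type": "docx"} for _ in range(400)] + \
          [{"type": "pdf"} for _ in range(800)]

truncated = refiner_counts(results, "type", limit=500)  # what the user sees
full = refiner_counts(results, "type")                  # the real picture
```

Here the truncated refiner reports only 100 PDFs against a true total of 800, and nothing in the interface tells the user that the counts are partial - which is the second, more serious, of James's two complaints.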
Oct. 14, 2011 · 00:42:38
In this podcast Brad Teed (CTO of GimmalSoft) and James Lappin discuss whether or not SharePoint can be regarded as a records management system. Brad says that it can, with the caveat that it may not do things in the way that traditional records management systems do them. James concedes that SharePoint 2010 has records management features (such as holding and applying retention rules, holding a hierarchical classification, and locking documents down as records) but feels that these features are not brought together in a coherent enough way to justify calling SharePoint a records management 'system'. SharePoint 2010 offers organisations two different approaches to records management - the in-place approach and the records centre approach. Brad and James describe and critique these two approaches. James characterises the choice between them as being like that between 'a rock and a hard place'. Brad describes the challenge of managing the routing rules necessary to get documents from SharePoint team sites to the records centre. James describes the problem of in-place records management, which leaves records scattered around team sites under the control of local site owners without providing any reporting capability to give a records manager visibility over them all. Brad and James will be debating the issue of records management in SharePoint live at the SharePoint Symposium in Washington on 2 November 2011.
Oct. 8, 2011 · 00:54:29
James Lappin asks Alan Pelz-Sharpe 10 questions about the current state of the enterprise content management market. Here is a flavour of some of Alan's answers - there is a lot more detail in the podcast itself.

Why have HP bought Autonomy?
Alan said that most analysts were surprised at how much HP paid for Autonomy. The best guess at what HP (a hardware company) wants to do with Autonomy (a software company) is that they may wish to create some kind of appliance which has Autonomy's IDOL search engine already loaded onto it (a bit like the Google search appliance). One thing that HP and Autonomy have in common is that they have both bought well-regarded electronic records management systems (Tower and Meridio respectively), and done very little with them.

How hard have the ECM vendors been hit by the rise of SharePoint?
Alan said that the ECM vendors haven't been hit as hard as you might think. Their revenues are still rising, and most of them enjoy good relations with Microsoft.

How do EMC and Open Text compare with the bigger ECM vendors (Oracle and IBM)?
Alan said that Oracle and IBM are so big because they do a huge variety of stuff as well as ECM. But at the end of the day, if you are buying FileNet from IBM you are dealing with the FileNet division, not the whole massive company. So for buyers of ECM systems company size doesn't matter that much. Open Text is the largest company that focuses exclusively on ECM. EMC's business is mainly about storage. They bought Documentum, but Documentum is very different from the rest of the EMC group and there have not been many synergies.

What is happening in the CRM (customer relationship management) arena and how does it relate to ECM?
Essentially ECM and CRM are separate worlds without much overlap. CRM is a vital tool for many organisations. As yet there are not a great many tie-ins with ECM. Oracle has both a CRM and an ECM suite, which work together reasonably well.
SAP signed a large deal with Open Text, but there doesn't seem to be a huge number of organisations using SAP together with Open Text products. Many of the CRM tools will do a little bit of document management of customer-related documents, but for the most part organisations will have CRMs that don't talk to whatever ECM product(s) they have.

The Europeans have just revised their electronic records management specification (MoReq2010). When will the US records management standard DoD 5015 be revised (it was issued back in 2007)?
Alan said he didn't know of any plans to revise DoD 5015. SharePoint drove a horse and cart through DoD 5015, because Microsoft made the decision to release a document management product that did not comply with it but which had huge market success. Vendors didn't like DoD 5015 because it was very hard for them to tailor their products to.

What is happening in the intranet arena?
Alan said that nothing dramatic is happening in the intranet arena. Some intranet makeover projects will have been hit by the economic downturn. Alan can't understand why some organisations want to use the same product to manage their external website and their intranet - to him they are fundamentally different things.

Do you know any organisation that manages their e-mail well?
Alan said that of all the ECM implementations that he sees, the type that gives the quickest and most reliable return on investment is an e-mail archiving tool brought in to take stored e-mails off the mail servers.

What do you think of PAS 89?
Alan thought PAS 89 a good attempt to define the scope of enterprise content management, although he can't think what an organisation would specifically use it for.

How does Alfresco compare with the proprietary ECM products?
Alan said that if we were talking about open source ECM products, Nuxeo should be mentioned alongside Alfresco. Both of them are established, mainstream enterprise content management systems.
The main difference between them and the proprietary ECM products is the licensing model.

How does Google Apps compare with the established ECM products?
In terms of impact on the ECM market Alan is more interested in Box.Net than Google Apps. Alan and James discussed the prospect of new start-ups deciding not to set up shared drives, and instead using services like Box.Net in the cloud to provide a relatively simple place for colleagues to store and share documents.
July 27, 2011 · 00:31:10
In this episode Alan Pelz-Sharpe discusses the current state of ECM in Brazil with Walter Koch. Topics they cover include:
- the project undertaken by Brazilian banks to scan cheques and process them electronically. Hitherto 72 aeroplanes per night have been needed to move cheques around the country - in future cheques will be imaged at the branch at which they were received, and then, once processing is complete, the hard copy will be destroyed without the original cheque having been moved
- the dramatic rise of SharePoint in Brazil - Alan said he went to Brazil's main ECM show in 2008 and saw virtually no mention of SharePoint. He went to the same show in 2010 and SharePoint dominated it. Walter said that to accommodate SharePoint the ECM show in Sao Paulo in September 2011 will split into two - an ECM show and a SharePoint show, both running alongside each other, both the same size. Listening to Walter it strikes me as amazing that one ECM product (SharePoint) has grown to warrant the same size of conference as all of the rest of the ECM world put together
- key ECM vendors in Brazil - Walter says that the same big five companies (Oracle, Open Text, IBM, EMC and Microsoft) are dominant in the Brazilian market as elsewhere in the world, but that there are also some local players
- Walter's observations on ECM in the Middle East, and on the recent Info 360 event in the US
This podcast was recorded on 19 July 2011, and lasts for 31 minutes.
July 13, 2011 · 00:52:53
In this episode analyst Ralph Gammon, author of the Document Imaging Report newsletter and blog, joins Alan Pelz-Sharpe and James Lappin to discuss the state of the market for document capture software. Capture software, such as Kofax and Captiva, is used to make sense of scanned documents. It is typically used to apply optical character recognition (OCR), or barcode recognition, to scanned documents. More sophisticated use cases involve integrating a capture product with an enterprise content management (ECM) system, an enterprise resource planning (ERP) system such as SAP, or a line-of-business (LOB) application. The capture product might be used to identify what type of document a scanned image is, and to kick off an appropriate workflow within an ECM/ERP/LOB application. Or the capture product might be trained to help with form processing, where a large volume of paper forms are received and scanned. The role of the capture product might be to read the entry in each field of the form and place that entry in the appropriate metadata field within the ECM/ERP/LOB, which could then trigger an appropriate workflow. Ralph identified the main value that capture software brings as reducing keystrokes - reducing the amount of manual effort needed to make scanned images of paper documents usable by an organisation on their electronic systems. Alan points out the downside of this - some large capture projects result in job losses. Alan said that many of his clients think that Kofax and Captiva are the only players in the capture market. Ralph said that many of the traditional ECM vendors have some sort of partnership with a capture vendor. EMC (owners of Documentum) own Captiva. IBM bought Datacap. Oracle have a relationship with Brainware. Kofax and ReadSoft are independent of any one ECM vendor. Microsoft are not linked with any particular capture vendor, and several vendors have worked on plug-ins to integrate capture software with SharePoint.
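The form-processing pattern described above - read the fields of a scanned form, place the values in metadata fields, and trigger an appropriate workflow - can be sketched as follows. This is an invented illustration, not the behaviour of any particular capture product: the document types, field names and workflow names are all hypothetical.

```python
# Hypothetical mapping from recognised document type to downstream workflow
WORKFLOW_BY_DOCTYPE = {
    "invoice": "accounts-payable-approval",
    "timesheet": "payroll-processing",
}

def route_scanned_form(ocr_fields):
    """Turn raw OCR output (a dict of field name -> recognised text)
    into (metadata, workflow) for a downstream ECM/ERP/LOB application.
    Unrecognised document types fall back to manual review."""
    doc_type = ocr_fields.get("form_type", "unknown").lower()
    metadata = {k: v.strip() for k, v in ocr_fields.items()}  # clean up OCR whitespace
    workflow = WORKFLOW_BY_DOCTYPE.get(doc_type, "manual-review")
    return metadata, workflow

meta, wf = route_scanned_form(
    {"form_type": "Invoice", "vendor": " Acme Ltd ", "amount": "1,250.00"}
)
```

The "reducing keystrokes" value Ralph identifies is visible even in this toy: nobody has to re-key the vendor name or amount, and the routing decision is made automatically.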
June 26, 2011 · 00:26:51
MoReq2010 is the European Union's new specification of requirements for electronic records management systems. It is a radical departure, in both form and content, from previous versions of MoReq, and from other electronic records management specifications such as the US DoD 5015.02 standard (the latest version of which was published in 2007). Previous electronic records management specifications aimed to specify a system that could act as the single records repository for a whole organisation, with users being expected to save any document needed as a record into that repository. They created the phenomenon of the 'electronic document and records management system' (EDRMS). The EDRMS model was dealt a severe blow by the rise of Microsoft's SharePoint, which did not attempt to meet those specifications, and which took the collaboration space away from EDRMS vendors. MoReq2010 was in many ways a response to the rise of SharePoint, to the persistence of multiple content repositories within organisations, and to the emergence of alternative formats to the 'document' and the 'file/folder'. MoReq2010 aimed to encourage a diversity of different models for records management systems - as well as the EDRMS model, it was possible for the following models to be compliant with MoReq2010:
- systems that had no user interface, but which captured records saved into existing applications and repositories within the organisation
- systems that did not hold records themselves, but instead protected and managed records held in existing applications and repositories within the organisation
- line-of-business applications and single-purpose applications that had records management functionality built into them
In this podcast Alan Pelz-Sharpe said that the enterprise content management market is a global market, and most of the big technology companies are based in the US.
For MoReq2010 to have a big impact on those vendors, it would need to have some traction and recognition within the US. James Lappin felt that it would be beneficial for the records management community if MoReq2010 became more influential than the existing US standard DoD 5015.02. DoD 5015.02 included the specific security requirements of the defence and intelligence sector, which many organisations did not need. MoReq2010 has taken a different approach. The core requirements include only those record-keeping needs perceived as common to all sectors. Any sectors with specific requirements (health, legal, defence etc.) are encouraged to write plug-in modules to MoReq2010 that organisations within those sectors could use to inform their buying decisions. Alan wondered whether the relative lack of publicity for the launch of MoReq2010 in the US would harm its chances of adoption in that country. In the podcast we referred to several blogposts written about the launch of MoReq2010, including: Alan's post 'Is MoReq 2010 a DoD 5015 slayer?', James's post 'How MoReq 2010 differs from previous electronic records management specifications', and Marc Fresko's post 'Shhh don't tell anyone'. This podcast was recorded on 15 June 2011 via Skype.
April 21, 2011 · 00:46:14
James and Cheryl started by discussing the rise of open source enterprise content management systems. They went on to discuss the impact of CMIS (Content Management Interoperability Services). CMIS is an OASIS specification, created by a group of enterprise content management system vendors (IBM, EMC, Microsoft, Alfresco, Open Text and others). CMIS enables different content repositories within an organisation to interoperate with each other even if they are written in different programming languages. If a vendor adds a CMIS-compliant layer to their application, then other applications can use CMIS protocols to perform basic content management operations on that application. For example, if an organisation installed an application that had a CMIS layer, it could allow one of its other applications to use CMIS protocols to do things such as:
- search it
- navigate around its folder structure
- add documents to it
- update documents in it
James and Cheryl discussed the progress vendors had made in adding CMIS layers to their products. Cheryl recommended the blogs of Laurence Hart (Word of Pie) and Florent Guillaume as good sources of comment and information on CMIS. Towards the end of the podcast James and Cheryl discussed the question of whether it was either possible or meaningful to make a distinction between 'documents' and 'records'. Cheryl is the founder of Candy Strategies, and blogs at . She is @CherylMcKinnon on Twitter. James is @jameslappin on Twitter and blogs at . The podcast was recorded on 21 April 2011 via Skype.
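To make the operations above concrete, here is a hedged sketch of how a client might build request URLs in the style of the CMIS Browser binding. Note the hedges: the Browser binding (JSON over HTTP, using a `cmisselector` query parameter) was added in CMIS 1.1, after this podcast was recorded - at the time CMIS 1.0 offered AtomPub and Web Services bindings. The repository URL and object IDs are invented, and no requests are actually sent.

```python
from urllib.parse import urlencode

class CmisBrowserUrls:
    """Builds CMIS Browser-binding-style request URLs for a repository.
    Illustrative only - a real client would also send the requests and
    parse the JSON responses."""

    def __init__(self, repo_url):
        self.repo_url = repo_url.rstrip("/")

    def children(self, object_id):
        """URL to list a folder's children - 'navigating around its
        folder structure'."""
        qs = urlencode({"cmisselector": "children", "objectId": object_id})
        return f"{self.repo_url}/root?{qs}"

    def query(self, statement):
        """URL for a CMIS Query Language statement - 'search it'."""
        qs = urlencode({"cmisselector": "query", "q": statement})
        return f"{self.repo_url}?{qs}"

urls = CmisBrowserUrls("https://ecm.example.org/cmis/browser/repo1")
children_url = urls.children("folder-123")
query_url = urls.query("SELECT * FROM cmis:document WHERE cmis:name LIKE 'report%'")
```

The point of the standard is that these same URLs (and the same SQL-like query language) work against any CMIS-compliant repository, regardless of which vendor built it or what language it is written in.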
March 25, 2011 · 00:50:51
Alan Pelz-Sharpe and James Lappin discussed the challenge of multiple repositories on 25 March 2011. Alan said that every organisation he had worked with had their content (documents, records etc.) spread across numerous different repositories. These repositories had typically grown up as the organisation merged with or acquired other organisations, and/or as it added new systems for specific lines of business. At the recent Info 360 trade show lots of people had come up to Alan to ask him what they could do about the problems caused by the multiple repository issue. James said that the two obvious approaches were to either:
consolidate content into one repository (Alan dubbed this 'the uber-repository'), OR
run a federated search across all repositories
Alan was sceptical as to the feasibility of either approach. Migrating all content into one repository was almost impossible because:
content is structured differently in each repository
metadata is captured differently in each repository
some of the repositories will be tailored to support specific processes, and it would not be possible to tailor the 'uber-repository' to support all of those different processes
Running a federated search over each repository is no panacea either. Let's assume you have connected the search engine to the various different repositories. Your search engine now has the problem of understanding the way each repository keeps metadata. And even if it managed to understand the metadata in each repository, it would still have the challenge of normalising across the various repositories, so that it could rank and present one coherent set of search results from them all. Alan thought you could make federated search work over three or four content repositories, but most of the organisations he had advised had far more than four. Near the end of the podcast we discussed the prospects for a challenger to SharePoint's market dominance emerging.
This podcast was recorded via Skype on 25 March 2011
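The normalisation step discussed in this episode can be sketched as a toy example: each repository returns hits using its own metadata field names and its own relevance scores, and the federated layer must map them onto one common schema before it can produce a single ranked list. The field names, repository names and scores below are all invented for illustration.

```python
# Per-repository mapping from native metadata fields to one common schema.
# Real repositories each have their own (much larger) metadata models.
FIELD_MAPS = {
    "sharepoint": {"Title": "title", "Author": "creator", "Rank": "score"},
    "documentum": {"object_name": "title", "r_creator_name": "creator",
                   "score": "score"},
}

def normalise(repo: str, hit: dict) -> dict:
    """Translate one repository's hit into the common schema."""
    mapping = FIELD_MAPS[repo]
    out = {common: hit[native]
           for native, common in mapping.items() if native in hit}
    out["source"] = repo
    return out

def federated_rank(results_by_repo: dict) -> list:
    """Merge per-repository result lists into one ranked list."""
    merged = [normalise(repo, hit)
              for repo, hits in results_by_repo.items()
              for hit in hits]
    # Naive assumption: relevance scores from different engines are
    # directly comparable. In practice they are not, which is a large
    # part of why federated search is no panacea.
    return sorted(merged, key=lambda h: h["score"], reverse=True)

hits = federated_rank({
    "sharepoint": [{"Title": "Budget 2011", "Author": "A. Smith",
                    "Rank": 0.9}],
    "documentum": [{"object_name": "Budget draft",
                    "r_creator_name": "B. Jones", "score": 0.7}],
})
```

Even this toy version needs a hand-maintained mapping per repository; with the "way more than four" repositories Alan describes, keeping those mappings current becomes a significant cost in itself.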
Dec. 3, 2010 · 00:39:49
Alan Pelz-Sharpe came over to London in December 2010, and met with James Lappin to record this podcast. Alan had recently delivered a training course where he dealt with the traditional enterprise content management suites (Open Text, Documentum etc.) on day one, and Microsoft's SharePoint on day two. We discussed whether or not SharePoint can meaningfully be called an enterprise content management system, and concluded that it can, albeit with significant points of difference from the traditional suites. Alan expressed scepticism about the impact of cloud and mobile devices on enterprise content management buying decisions in the short term. His experience had been that whereas vendors of cloud services were making a lot of noise about the cloud, the organisations that he helps with buying decisions were not yet considering the cloud. Alan said that he could not see business critical information moving to the cloud any time soon. We also discussed:
the future prospects for Microsoft Office
the challenges posed for the implementation of retention schedules by the fact that cloud providers replicate information across different servers and different datacentres to maximise availability and up-time. How would it be possible to guarantee that all instances of a particular document or piece of information had been deleted?
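The deletion-guarantee question raised in this episode lends itself to a toy sketch: if a provider replicates every document across several datacentres, a retention process cannot treat one delete call as final; it has to confirm that no replica survives anywhere. The datacentre names and document IDs below are invented, and real replication (with lag, backups and caches) is far messier than a few in-memory sets.

```python
# Toy model: each datacentre holds a set of document IDs, because the
# provider replicates stored documents for availability and up-time.
datacentres = {
    "eu-west": {"doc-1", "doc-2"},
    "us-east": {"doc-1", "doc-2"},
    "ap-south": {"doc-1"},  # replication of doc-2 to here lagged
}

def delete_everywhere(doc_id: str) -> bool:
    """Remove doc_id from every datacentre, then verify no copy remains."""
    for replicas in datacentres.values():
        replicas.discard(doc_id)
    # The verification pass is the crucial step for retention schedules:
    # a document only counts as destroyed once every replica is gone.
    return all(doc_id not in replicas for replicas in datacentres.values())

fully_deleted = delete_everywhere("doc-2")
```

In this toy world verification is trivial; the point of the discussion is that in a real cloud service the retention manager can usually see none of these replicas directly, and has to take the provider's word for it.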
Sept. 30, 2010 · 00:51:06
Lee Dallas is one of the brains behind Big Men on Content - an incisive independent blog on the world of enterprise content management, which Lee created together with Marko Sillanpaa. Lee works for EMC on partner system engineering. In this episode Lee discusses the future of enterprise content management with James Lappin. Lee said that the rise of SharePoint had been the major trend of the last five years, but that the major trend of the next five years would be the need for enterprise content management systems to react to the challenge of mobile devices. Lee explained why transitioning ECM systems to smaller mobile devices is far from straightforward. James and Lee discussed the limitations of the 'document' as a format, and speculated on what formats might emerge to improve on it. This gave Lee an opportunity to explain his view that in IT, nothing ever dies. This podcast was recorded on 30 September 2010.
July 30, 2010 · 00:49:33
On the afternoon of 29 July 2010 Angela Ashenden (Principal Analyst at MWD) met with Alan Pelz-Sharpe and James Lappin. They discussed how collaboration and social computing fit in with enterprise content management. Should organisations seek to use the social computing functionality of their preferred enterprise content management vendor (SharePoint, Documentum, Open Text, Oracle etc.), or are they better off with a niche social computing vendor? Why are organisations more likely to introduce a system for collaboration than for records management?
July 29, 2010 · 00:45:03
To kick off the ECM Talk podcast series (and to take advantage of the fact that Alan was over in the United Kingdom on holiday), James Lappin and Alan Pelz-Sharpe met up in Winchester on 29 July 2010 to discuss records management. James argued that the problem records managers have is not convincing people that records management is important, but convincing organisations that there are feasible steps they can take to improve their records management. Alan talked about the danger of organisations assuming that if they purchased software that met standards such as MoReq and DoD 5015 then their records management needs were covered. We went on to discuss the pros and cons of e-mail archiving systems - with James questioning whether an e-mail archiving system can function as a useable information resource within an organisation.