Records Management and Integration with ECM Application


Curious about what systems/strategies for records management and Records Retention Schedules the AIIM community has found to be most effective. How successful have you been integrating the management of a Record Retention Schedule into your ECM application? How about the management of your physical records? Have you integrated your Records Management System (for physical records) into your ECM application?

—————————–———————
Alyeska Pipeline Service Company
—————————–——————-—


I have implemented OpenText Content Server with Records Management for dozens of clients over the years, including physical records. I find it intuitive, and it works well.

We also work with SharePoint, Laserfiche, FileNet, M-Files, and a number of SaaS/online providers like SpringCM and Box. In my opinion, OpenText leads the pack in this regard.

Cheers,

—————————–——
IQ Business Group, Inc
—————————–——


Richard is absolutely correct that OT (OpenText) is, without a doubt, the forerunner. That said, undertaking an OT implementation, and sustaining it at both the technical and functional levels well enough that the organization can truly reap solid benefits, is not for the faint of heart. Or of pocketbook!

And since OT purchased Documentum, that is no longer a real alternative.

Having done a FileNet implementation, I will be as kind as I can and just politely suggest looking elsewhere…

As for SharePoint, that platform's greatest strength is also, in my opinion (based on years of experience, knowledge, and training with it), its greatest weakness: to really make SP deliver depth of capability in virtually any area requires purchasing, at a bare minimum, one significant third-party add-on from the vast ecosystem that has grown up around it. I seldom see or hear of situations where just one add-on can deliver sufficient value on both the functional-use side and the system-administration side (technical and functional admin). And that is where the "platform" design intent behind SP can quickly become a somewhat daunting weakness. In particular, I have a significant issue with using SP natively for RM, because one of the major functional 'objects' in the software is a thing called a Content Type.

Content Types are essentially containers with labels that reflect your information architecture. Off of these hang metadata, workflows, templates, and other aspects. That part is great. What isn't great is that in order to do RM with Content Types, you are essentially forced to structure them to reflect your retention schedule architecture, which means you have no choice but to force users to use that same IA from an ECM perspective (read: 'productivity'). The only use case I've seen where this is perfectly acceptable to users is a government archives department (municipal, state/provincial, or federal). Other than that, it is highly counter-intuitive for users to classify their content that way; it's not how they USE it for business purposes. But that is where the third-party add-ons come in. Most of the RM add-ons create a logical separation, in some cases down to the database level, between the IA that the business uses and the IA that the RM department/function uses. But that obviously means buying additional software from additional vendors to deal with that aspect.
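
To make that separation concrete, here is a purely illustrative sketch. None of the names are tied to any specific add-on; they are made up for the example.

```python
# Illustrative only: how a third-party RM add-on might keep the business-facing
# information architecture separate from the retention-facing one.
# All names here are hypothetical.

# What users see and work with day to day (productivity-oriented IA)
business_locations = {
    "Projects/Pipeline-Expansion/Contracts": ["MSA-2019.docx", "Amendment-3.pdf"],
    "HR/Recruiting/2020": ["offer-letter-jdoe.docx"],
}

# What the RM function manages (retention-schedule-oriented IA)
retention_schedule = {
    "AGR-001 Agreements and Contracts": {"retention_years": 7, "trigger": "contract_expiry"},
    "HR-014 Recruitment Records":       {"retention_years": 2, "trigger": "position_filled"},
}

# The add-on maintains the mapping between the two, so users never have to
# file content according to the retention schedule themselves.
location_to_retention_class = {
    "Projects/Pipeline-Expansion/Contracts": "AGR-001 Agreements and Contracts",
    "HR/Recruiting/2020": "HR-014 Recruitment Records",
}

for location, docs in business_locations.items():
    rule = retention_schedule[location_to_retention_class[location]]
    print(f"{location}: {len(docs)} item(s) -> retain {rule['retention_years']} years "
          f"after {rule['trigger']}")
```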

I haven't done anything with Laserfiche, M-Files, or Box, so I will defer to others with regard to those.


Agree with Lorne generally (see specifics in my other post here). One addition, however: With O365/SharePoint Online, you no longer have to rely on content types for content lifecycle management. You can still use that approach (and all the cumbersomeness it entails), but you can also use retention labels to accomplish the desired result. Recently, Microsoft introduced a set of file plan management capabilities (again, only for O365) that make it all easier to use.
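
For anyone who wants to poke at the newer retention-label machinery programmatically, here is a minimal sketch that lists labels via the Microsoft Graph records management API. Treat the endpoint path and permission name as my assumptions and verify them against current Microsoft documentation; the placeholders and the simplified token handling are obviously not production code.

```python
# Rough sketch only: enumerate retention labels via Microsoft Graph.
# Assumptions to verify against current Microsoft docs:
#   - endpoint: /v1.0/security/labels/retentionLabels (records management API)
#   - required application permission: RecordsManagement.Read.All
# Requires a registered Azure AD app; msal is used here for the
# client-credentials flow, but any OAuth2 client would do.
import requests
import msal

TENANT_ID = "<your-tenant-id>"        # placeholder
CLIENT_ID = "<your-app-id>"           # placeholder
CLIENT_SECRET = "<your-app-secret>"   # placeholder

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])

resp = requests.get(
    "https://graph.microsoft.com/v1.0/security/labels/retentionLabels",
    headers={"Authorization": f"Bearer {token['access_token']}"},
    timeout=30,
)
resp.raise_for_status()

for label in resp.json().get("value", []):
    print(label.get("displayName"), "-", label.get("behaviorDuringRetentionPeriod"))
```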

There are some considerations/limitations even with the updated native Microsoft approach, specifically:

- Only works against Microsoft cloud-native content sources – Exchange Online, SharePoint Online, OneDrive for Business, Groups (which is really SharePoint). Teams support is in the works.
- Doesn't address physical records management at all. You can roll your own with lists or physical record content types, but it's cumbersome.
- Requires E5 tenant-wide licensing to turn on the required security and compliance features that make it possible.

In my experience, Lorne’s exactly right – you do need at least one 3rd-party add-on to SharePoint to get the job done. I listed two options for the records lifecycle management element in my other post. I generally also recommend three other categories of add-ons:

- Tools to help users supply metadata for content at creation or update time (e.g., Colligo, Harmon.ie, Repstor, etc.),
- Tools to let users take advantage of that metadata via Search (e.g., Lightning Conductor, etc.), and
- Tools to standardize and automate the provisioning of new filing locations (and associated default metadata) based on defined patterns (e.g., PnP provisioning; a rough sketch of the idea follows below).
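
Here is that sketch: a declarative "pattern" for a new filing location with default metadata, in the spirit of PnP-style provisioning. It is purely illustrative and is NOT the real PnP template schema; the names are invented.

```python
# Illustrative only: a declarative pattern for provisioning a new filing
# location with default metadata. Hypothetical names throughout.
from dataclasses import dataclass, field

@dataclass
class FilingLocationTemplate:
    name_pattern: str                      # e.g. "Contracts - {vendor}"
    default_metadata: dict = field(default_factory=dict)
    retention_class: str = ""              # links the location to the file plan

CONTRACT_LIBRARY = FilingLocationTemplate(
    name_pattern="Contracts - {vendor}",
    default_metadata={"Department": "Procurement", "RecordType": "Agreement"},
    retention_class="AGR-001",
)

def provision(template: FilingLocationTemplate, **kwargs) -> dict:
    """Resolve a template into a concrete location definition (no API calls here)."""
    return {
        "title": template.name_pattern.format(**kwargs),
        "default_metadata": template.default_metadata,
        "retention_class": template.retention_class,
    }

print(provision(CONTRACT_LIBRARY, vendor="Acme Welding"))
```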

If you go hog-wild and buy all of those things and any number of other gewgaws, you'll still have plenty of money left over compared to what it would have cost to acquire, use, and manage any of the legacy platforms.

——————————
OfficeOptimus, LLC
——————————


Just a note of caution about the O365 “Labels” feature:

Because they are just a label, they do NOT make your records immutable the way that a product like RecordPoint, Gimmal, or Collabware does.

Nor do they provide the ease of administration, auditability/reporting, or automation that those 3rd party products do.


Mitch, labels, including those with the Record behavior enabled, can be applied automatically with the newer Advanced functionality. In addition, the Advanced functionality is now available as an add-on to E3, so E3 customers can get it for a little additional cost without having to pay for E5.

https://www.colligo.com/blog/office-365/how-office-365-machine-learning-will-change-records-management/

https://www.computerworld.com/article/3331897/microsoft-windows/microsoft-spins-off-security-and-compliance-bits-from-microsoft-365s-priciest-plan-to-upsell-e3-cus.html


Yes, there is some auto-classification functionality available now in O365/M365. I've tried it out a bit and, to be honest, it has a long way to go (in MY opinion, obviously) before it reaches the capabilities of the leading third-party products listed in this discussion. And I think it will be quite a while, if ever, before MS extends it beyond the O365/M365 border to other applications and content stores, ranging from ERP platforms (which generate lots of records and have significant records viewing/recall requirements for users) to CRM, PLM, geospatial systems, and so on.

Given that, I still don’t see it as a “serious” solution. Not today.


Hi,

That is a loaded question, since integration can vary greatly from client to client, but yes, we have done a number of OpenText and O365 integrations for various components, most notably email auto-classification.

In general, there are components (AGA, Enterprise Connect) that are pretty straightforward and come as complete packages/modules on the OTCS side; there are also third-party modules. But those capabilities are centered on users who are in OTCS on a day-to-day basis. If OTCS is more of a back end for you, then auto-classification and archiving may be the angle you want to pursue.

Hope that helps.
Feel free to ping me directly or start a new thread on this.

Cheers!

—————————–——
IQ Business Group, Inc.
—————————–—–


In order to integrate OT with SP, you require a module from OT called AGA (Application Governance and Archiving). The module was developed collaboratively by MS and OT. It is infamously challenging to design and configure the interplay between the platforms, not so much at the technology level (that part is actually moderately straightforward) as from the information architecture and governance aspects.

Now, IF you can climb those hills, then you have a very strong information management backbone!

I know that, as of earlier last year, in order to use SPO (SharePoint Online/Office 365) you HAD to go to the OT Cloud, which is an OT-managed, PaaS-type offering. That may have changed by now; you would need to confirm with OT directly. If that situation has not changed, and you have an on-prem OT environment, that means you would have to go through a migration to their cloud. Time, budget, and change management, obviously. The other option that OT presented (at that time) was to run SharePoint in a hybrid configuration, where OT would integrate with the on-prem SP environment.

Have you ever watched the movie ‘Highlander’? The original, with Sean Connery, not the imitations. It’s the origin of what I call the ‘Highlander Theory of RM’, which states that – where the record lifecycle management policy is concerned – “There can be only one”.

OpenText’s view of what that means is that since OTCS is obviously the master of the universe from a policy definition and management perspective, all content must live in OTCS.

That hasn’t worked out so well in practice, apparently including in your shop.

The alternative perspective is that of ‘Federated Records Management’, as practiced most notably by Gimmal Records Manager (GRM) and RecordPoint Records365 (R365).

Think of either of those platforms as the 'single point of policy expression' that spans multiple in-place content repositories via a single pane of glass. Remember: there can be only one!

Gimmal says that they can federate with OTCS directly as a managed content source. That means that you would extract the lifecycle management rules from OTCS and put them in the GRM console, but leave the OTCS-resident content where it is. GRM would then be used to manage lifecycle across both OTCS-resident content and all other content sources in your enterprise, using a consistent set of policies.

Records365 doesn't support native OTCS as a content source today, but the connector landscape is always changing. Both platforms support most of 'the usual suspects' when it comes to content sources, and both provide comprehensive support for the things that RMs like to see, such as disposition approval workflows, destruction certificates, transfers, audit trails, etc. Microsoft OOTB remains a bit behind on those things, and federation with non-MS sources isn't high on their priority list.
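
To make the 'single point of policy expression' idea concrete, here is a purely illustrative sketch. This is not the Gimmal or RecordPoint API; every name is hypothetical.

```python
# Illustrative only: one lifecycle policy, expressed once, applied in place
# across several content sources via "connectors". Hypothetical names.
from datetime import date, timedelta

FILE_PLAN = {
    "AGR-001": {"retain_years": 7, "trigger": "contract_expiry", "disposition": "destroy"},
    "FIN-003": {"retain_years": 10, "trigger": "fiscal_year_end", "disposition": "transfer"},
}

# Each connector enumerates items in its own repository and reports the
# trigger date; the policy itself lives in one place (above).
CONTENT_SOURCES = {
    "SharePoint Online": [{"id": "spo-1", "class": "AGR-001", "trigger_date": date(2017, 6, 30)}],
    "OpenText Content Server": [{"id": "otcs-9", "class": "FIN-003", "trigger_date": date(2012, 12, 31)}],
    "File share": [{"id": "fs-42", "class": "AGR-001", "trigger_date": date(2020, 1, 15)}],
}

def disposition_due(item):
    rule = FILE_PLAN[item["class"]]
    return item["trigger_date"] + timedelta(days=365 * rule["retain_years"])

for source, items in CONTENT_SOURCES.items():
    for item in items:
        due = disposition_due(item)
        action = FILE_PLAN[item["class"]]["disposition"]
        status = "eligible now" if due <= date.today() else f"due {due}"
        print(f"[{source}] {item['id']}: {action} {status}")
```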

Hope this helps. Good luck!

——————————
OfficeOptimus, LLC
——————————


Hi,
Does anyone have experience integrating an on-premises OpenText Content Server, currently only available internally and used for daily document management, with OpenText Core (the cloud service)?

The goal would be to have an environment where folders from OTCS can be ‘published’ to Core and remain synced with the internal system, so we can collaborate flexibly on the documents with external users, while keeping our main repository complete. Obviously, this needs to be very secure.

As a bonus, as we use MS Office 2016 internally and not O365, using Core would enable our users to do co-authoring.

Any experiences?
Thanks

SCK•CEN


https://core.opentext.com/support/categories/cs


Thanks, Ritch. I’ve been waiting for the restriction on auto-application of Record labels to get lifted. Good to know it has.

Great articles. Looking forward to the promised arrival of the ability to build auto-application rules that can incorporate Boolean selection based on searchable attribute values, not just keywords from body text.

Missed you at the SWC chapter lunch last week. Good session. Thanks.

——————————
OfficeOptimus, LLC
——————————


Well, Mitch, you’re in luck. Within the below link, “You can auto-apply labels to content that satisfies certain conditions. The conditions now available support applying a label to content that contains specific words, phrases, or values of searchable properties. You can refine your query by using search operators like AND, OR, and NOT.”
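
To make that concrete, a condition of that kind might look something like the query below. The property names stand in for managed properties in your own search schema; they are hypothetical.

```python
# Illustrative only: a KQL-style condition that an auto-apply rule could use,
# combining searchable property values with Boolean operators.
# "RecordType", "Department", and "ContentType" are placeholder managed
# properties; substitute your own search schema.
contract_record_condition = (
    '(RecordType:"Agreement" OR RecordType:"Contract") '
    'AND Department:"Procurement" '
    'AND NOT ContentType:"Draft"'
)
print(contract_record_condition)
```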

Yep, I had signed up, but then I had to present on what we're doing with machine learning here. Next time…


Hmmm. Not seeing it deployed in my tenant as yet, but that’s not unusual. Have to be patient. Have you had a chance to work with File Plan Manager as yet? That’s also an elusive one in the wild.

——————————
OfficeOptimus, LLC
——————————

I don’t think that’s been actually released yet.

“Coming attraction”?


In 2014, I was tasked with taking our office from a completely unorganized shared drive to implementing electronic records management in our recently purchased ECM system (Laserfiche). I created the repository, folder structures, and procedures, working as a team with an implementation committee and test pilots (two people from our Legislative team), and continued with training, assisting each department in getting organized and bringing their documents into our new repository, overseeing quality control, etc.

I can't say that we followed any particular system or strategy. All of my research was either too confusing or the examples were too complex for what our small, independent office of approximately 30 office workers wanted. Having said that, I had not been introduced to AIIM when I started. AIIM now has some very good material that can be used to review and decide on a strategy to follow.

We did decide to go with a bucket-style retention schedule instead of a complex, coded classification system.
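
For anyone weighing the same choice, here is a rough illustration of what I mean by a bucket-style schedule versus a coded classification. The categories and periods below are made up for the example, not our actual schedule.

```python
# Illustrative only: a handful of broad "buckets" instead of a detailed,
# coded classification scheme. Categories and periods are made up.
bucket_schedule = {
    "Transitory":             {"retain_years": 1,    "disposition": "destroy"},
    "Routine administrative": {"retain_years": 3,    "disposition": "destroy"},
    "Financial":              {"retain_years": 7,    "disposition": "destroy"},
    "Legal and agreements":   {"retain_years": 10,   "disposition": "review"},
    "Permanent/archival":     {"retain_years": None, "disposition": "retain"},
}

# A coded scheme, by contrast, might have hundreds of entries like
# "FIN-03-12 Accounts Payable - Vendor Invoices", each with its own rule;
# users filing documents would have to pick the right code every time.
for bucket, rule in bucket_schedule.items():
    years = "permanent" if rule["retain_years"] is None else f"{rule['retain_years']} years"
    print(f"{bucket}: {years}, then {rule['disposition']}")
```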

I did learn the value of having the full support of the head of the organization and of my committee when implementing the new system. We achieved a successful, valuable end result when many similarly sized municipalities did not, and the most obvious reasons are: full support of the leadership (including department supervisors); dedication to the project by assigning someone to work on it full time; careful testing prior to implementation; and quality control on incoming documents right from the start.

Our EDMS does not take physical records into consideration, although I have seen a demonstration of how that can work. All our older physical records (with a few exceptions) are handled as per the old procedures. Since implementation of our EDMS, all new records are scanned or otherwise brought into the EDMS and are then considered the official record. Physical copies are held temporarily, just until we have ensured they are safely in the system and on the backups. The exceptions: all current bylaws, agreements, and land information (we are a local municipality) were scanned into the repository, and the physical copies are also retained as per the retention schedule.

—————————–——
Sedgewick, AB CANADA
—————————–——

You may want to look into HPE Records Manager. You can include records retention dates for documents, files, and warehouse boxes. It can easily upload and send electronic documents. Physical records can be processed and indexed to the database, or you can scan and upload the electronic version. It has excellent, wide-ranging functionality.

—————————–————————————————
Office of the Commissioner, Major League Baseball
—————————–————————————————


Yes, we control all our physical files (20,000+ boxes and thousands of hard copy files) within our EDRMS, which is Technology One's ECM. We register archive boxes with retention and disposal schedules in line with legislation, and control all movements of these records within our EDRMS. We also track movements of records that have left our physical archive for transfer to the state records office (as our archive is also historical and goes back to 1872). These archive boxes sit alongside the 3.4 million records that we hold within our digital archive; a physical archive box, once registered, itself becomes a 'digital record' that contains all the necessary metadata for that physical record in a digital format.

We have two indexes within our EDRMS: an 'Archive' index for physical archive boxes, and a 'Hard Copy Files' index which manages the individual files (which are then logged against a box, etc.). We have other indexes, including Leases, Agreements, etc., for other types of physical documents that, due to their inherent characteristics, we have to keep in physical form for compliance purposes (rather than digitise them and dispose of the original), and then a whole host of indexes relating to other born-digital records.

The metadata collected about physical archive boxes upon registration focuses mainly on legislative compliance requirements. Every physical box archived also has a records form detailing every item in the box (including reference numbers for individual files which, depending on class, are also sometimes scanned into the hard copy index), and this records form is itself a digital record registered as part of the metadata against the archive box. This ensures the complete integration of our physical and digital archives within the one system. I hope that helps.
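
If it helps to picture it, here is a rough sketch of the kind of metadata a registered box might carry. The field names and values are invented for the illustration, not Technology One's actual schema.

```python
# Illustrative only: a registered physical archive box represented as a
# digital record with its own metadata, alongside born-digital records.
# Field names and values are invented for this sketch.
from dataclasses import dataclass, field
from typing import List

@dataclass
class HardCopyFile:
    reference_no: str
    title: str
    box_no: str

@dataclass
class ArchiveBox:
    box_no: str
    location: str                       # shelf/bay in the physical archive
    retention_schedule_ref: str         # disposal class under the relevant legislation (placeholder)
    disposal_due: str                   # e.g. "2032-06-30" or "Permanent"
    contents_form_record_id: str        # the scanned records form, itself a digital record
    files: List[HardCopyFile] = field(default_factory=list)

box = ArchiveBox(
    box_no="AB-018842",
    location="Row 14 / Bay 3",
    retention_schedule_ref="DA-2017 Class 12.1.1 (placeholder)",
    disposal_due="Permanent",
    contents_form_record_id="DOC-3400121",
    files=[HardCopyFile("HCF-55102", "Road reserve lease 1954", "AB-018842")],
)
print(f"Box {box.box_no}: {len(box.files)} file(s), disposal {box.disposal_due}")
```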

—————————–—————
Shire of Yarra Ranges Council
—————————–—————


I would concur with most of what has been said, so I will avoid repetition. There are a number of ECM products available, ranging from the fantastic to the merely adequate. However, how one assesses the best fit for your organisation might be different to the criteria used by another organisation. I've used Objective ECM (for many years), and have had exposure to OpenText, Content Manager (formerly HP TRIM, then HP Content Manager, and now Micro Focus Content Manager), and RecFind, to name a few.

I've learnt a few things from my implementation experience, including the importance of finding someone who (a) knows the product in detail and so can help avoid obvious pitfalls, and (b) won't simply copy the system configuration from another organisation, particularly one unrelated to your own.

My rule of thumb: the first implementation is your learning implementation. The second is where you validate and apply what you've learnt. By the fourth or fifth, you're starting to have a good understanding of implementation issues and product configuration. However, most people don't have the luxury of multiple attempts before the current one, so find someone who has that experience and draw heavily on it.

—————————-————-
Foley Business Consulting
—————————–———-


As noted elsewhere in this thread, OpenText is great stuff, but it's expensive to acquire, implement, and operate. If you don't already have it, don't go there. If you're like most, you probably already have SharePoint, either online or on-premises. (If you don't have SharePoint, then you'll need something other than a file system, and you'll probably end up with SharePoint anyway, most likely SP Online.)

If that’s right, then you’ll need to manage lifecycle in SharePoint, and you have essentially two choices:

- Use a combination of a) OOTB Microsoft tools for managing electronic content lifecycles, and b) roll-your-own physical records lifecycle management, preferably inside of SharePoint, or
- Use either Gimmal Records Management or RecordPoint Records365 to provide integrated lifecycle management of both physical and electronic records, online or on-premises.

I've done all three. The advantage of the first approach is that you're only dependent on Microsoft, but you're taking on a larger development and support burden. Either of the third-party options is probably less expensive than rolling your own, but they create a small-company dependency. With a few compromises, it's possible to preserve the option of falling back to pure Microsoft in the event that the third-party option runs out of gas.

All of these options allow you to easily represent your retention schedule in the form of a file plan that serves to enforce your retention and disposition policies against in-place content across multiple repositories. Some are easier to use than others, but all produce the same ultimate result.
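
As an illustration of what "representing the retention schedule as a file plan" amounts to, here is a minimal sketch of translating a schedule row into a file plan entry. The descriptor fields are generic, not any one product's schema, and the citation shown is a placeholder.

```python
# Illustrative only: translating a retention schedule row into a file plan
# entry. Generic field names; not any specific product's import format.
from dataclasses import dataclass

@dataclass
class RetentionScheduleRow:
    reference_id: str       # e.g. "FIN-03"
    record_category: str    # e.g. "Accounts Payable"
    citation: str           # the legal/regulatory authority behind the rule
    retention_years: int
    event_trigger: str      # "created", "fiscal_year_end", "contract_expiry", ...
    disposition: str        # "destroy", "review", "transfer"

def to_file_plan_entry(row: RetentionScheduleRow) -> dict:
    """Shape a schedule row into the kind of entry a file plan import expects."""
    return {
        "ReferenceId": row.reference_id,
        "Category": row.record_category,
        "Citation": row.citation,
        "RetentionPeriodYears": row.retention_years,
        "RetentionTrigger": row.event_trigger,
        "DispositionAction": row.disposition,
    }

print(to_file_plan_entry(RetentionScheduleRow(
    "FIN-03", "Accounts Payable", "State Records Act (placeholder)", 7,
    "fiscal_year_end", "destroy",
)))
```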

Here’s what I know from experience will NOT work: Asking users to do anything over and above what they’re already doing with their content in order to satisfy content management lifecycle requirements. Examples of such ‘unnatural acts’ include asking them to declare records, or enter records-specific metadata, or (worst of all) submit their content at any point in the lifecycle for management in some ‘foreign’ system. That last one is the most common use case for OpenText (and other legacy systems) in the real world, and it just doesn’t work. You end up with an ‘iceberg effect’ – the tip of the content iceberg that you can see is in the ‘system of record’, but the dangerous part of the iceberg – the part that will sink your ship – is below the waterline, hidden in various systems of engagement that are NOT accessible to your RM policy.

It’s best to avoid that situation, and you can only do that by applying lifecycle policy to content in-place, where the users want to use it. Good luck!

——————————
OfficeOptimus, LLC
——————————


Totally agree with Mitch. If your solution (the combination of the technical pieces plus the information architecture and configuration) can't manage the RM aspects by leveraging what users are already willing to do for their own content management productivity purposes, you are very likely to fail. And, again as Mitch pointed out, you may not even know how badly you're failing until something happens and you need authoritative, current, accurate, verifiable, etc. records.

I would like to thank Mitch for his post, as I have had similar thoughts and almost said some similar things. Although I was hired to take one of those legacy systems that Mitch talks about, create the repository and everything it requires, and do what is necessary to achieve a successful end result, the whole process was so painful that at one point I questioned the need. We are a smaller municipal office. We had to get organized before migrating into the new repository anyway. If we had just done what was needed to get organized on the shared drive and then followed some of the suggestions that Mitch outlines, it may have cost a lot less. I don't know, as I've never used SharePoint.

We did also run into the issues Mitch talks about with getting users to do what was asked. However, (1) we have good support from our CAO and (2) we assigned "champions" for the repository in each department (in most cases, one of the department's administrative assistants) and trained them to be the "go to" person for their workmates. They are responsible for ensuring that all their department's records end up in our EDMS and for making it as easy as possible for their workmates to find their records using the tools I provided (templates, Saved Search configurations, Saved Column Profiles for sorting via columns of metadata), with ongoing training for these champions on how to use those tools and automation applied wherever we could see possibilities. Most recently, our annual audit, completed by an independent third party, is now done via the records in our EDMS, which ensures that these "champions" know they absolutely must have their records in the system in the manner we have trained them.

Thanks for your comments, Mitch!

—————————–———
Sedgewick, AB CANADA
—————————–———


My pleasure, Connie. And congratulations on navigating the minefield. If I may ask, which EDMS did you end up using? You can’t do the things you mentioned with a mere file system, so you must be using something.

You have to be ever-vigilant, though. Those users are always coming up with new and ‘interesting’ (in the Chinese curse sense of the word) ways of working with content. Every bright shiny object that emerges from the cloud seems to attract someone’s interest – either internally or seduced (er…’introduced’) by an external stakeholder.

Again, you can either try to fight that trend (which will often fail, especially when it’s a senior executive who’s chasing the shiny object), or you can surf it.

To do the latter, you have to have ‘federated content management’ capability. And that’s where you have to part company with a pure Microsoft OOTB approach.

Both RecordPoint and Gimmal have the ability – via ‘connectors’ – to actively integrate content that resides in multiple different source systems within a single, consistent lifecycle policy-management framework. If that’s a problem you face, then you might want to take a look at them.

Best of luck,

——————————
OfficeOptimus, LLC
——————————


We are using Laserfiche and are quite satisfied with it! It has many awesome tools that we are attempting to make the most of. And it's easy to learn!

Being ever-vigilant? Absolutely! One of my daily, or at least weekly, tasks is quality control. I literally view every document coming in (we're a small org!), looking for glitches, ensuring the naming convention is being followed and that templates are assigned, and spot-checking other things. Once a year, I run searches looking for anything missed: duplicates, misfiled (or not yet filed!) items, or missing records, and I talk to the department champions so that they take care of these items before the end of the year. We have business processes (automation) that watch the incoming mail to ensure it is handled quickly and that the people who need to know about new mail are on top of it.

And the most important thing helping us ensure that records are handled appropriately is the paperless processes that Laserfiche's Forms/Workflow/Business Process tools enable. Entire processes have been revamped and automated. The whole thing starts with a form, sometimes submitted by the public via our website (for example, a County landowner who wants to participate in a program the County is offering); upon submission, the application is automatically saved in LF. All staff responses are handled via forms (and automatically saved in LF), final report forms finish off the process (and are automatically saved in LF), and operational/program report summaries are populated via LF Reports without anyone having to compile them by hand. The more paperless/automated your process is, the more you can ensure accuracy and completeness in your records and your project-end/year-end reports.

Thanks, Mitch!

——————————
Sedgewick, AB CANADA
——————————

Thanks. Yes, LF is a fine product, and it sounds like you’re using it effectively. In a situation such as yours, it can work well. Good to hear that’s the case for you.

Hopefully this exchange will prove useful for other AIIMers as well. Thanks for contributing to the collective consciousness!

——————————
OfficeOptimus, LLC
——————————
