DAM Champ: Kathleen Cameron
DAM Champ Kathleen Cameron is the digital asset manager at Nest. She received her BFA in film and photography from the San Francisco Art Institute in 1989. She was always more interested in process than end product, which led her to work in photography archives and eventually to an MLIS. By focusing on users’ requirements and analyzing outcomes, she has developed best practices and sustainable procedures for getting the most out of digital asset management tools and services.
DAM champ: Someone who supports finding, setting up, or maintaining a digital asset management (DAM) system. DAM champions come from a wide variety of positions, including production, creative, management, IT, and marketing.
What is your current role in digital asset management?
I am the digital archivist here at Nest and I’ve been here for about a year. That’s part traditional digital asset management and part content management, so looking at the full lifecycle of content there.
Has digital asset management been your thing since the start?
It’s been my thing for a long time. In 1992, I was working for an analog photo library, and that’s when I first started exploring digitization and digital asset management tools. There weren’t a lot of choices back in 1992, but it wasn’t very long after that tools started becoming more readily available.
What is your background?
I have a bachelor of fine arts in film and photography, and when I finished my undergrad degree I started working at the Bettmann Archives in New York. I worked there doing research, so I had a sense of how people were using photography. From there I went to manage the collection at Archive Photos, and that’s where I started looking at digital asset management. Then after that, I moved back to the Bay Area and started playing around with metadata, figuring out how people were managing their collections, and I set up a few FileMaker Pro collection management systems for nonprofits.
Then I worked as a photo editor for a long time. And that was the first time I worked with a digital asset manager. I learned a lot from her in terms of digitization and metadata, and then eventually started deploying DAMs other places. I did a full DAM implementation for Quokka Sports, the company that first created nbcolympics.com, so I created the taxonomy for that and did the full implementation, from requirements to deployment of a DAM tool.
I’ve just done a variety of things since then. I went back and got my Master of Library and Information Science (MLIS) degree about 12 years ago and then worked in academia for a while doing digitization projects and looking at how to disseminate them. So, I had some big projects and small projects. I did everything from the digitization itself, which is complex because it’s got complex metadata behind it, to analyzing which unique, frequently requested objects in a particular university’s archives might benefit from being digitized.
On your LinkedIn profile, it says you helped archive the University of California, San Francisco student newspaper and create an online homeopathy collection. Why do you think online collections or archives are important?
“I think they’re important because they create a mechanism for anyone to be able to access that collection.”
So the homeopathy collection at UCSF is unique. It has a unique volume from the most famous homeopath, [Samuel] Hahnemann, and it was his final author’s copy of the last edition of what’s called “Hahnemann’s Organon,” and it’s filled with handwritten notes. Because he died before the next edition was published, there’s some dispute about whether or not that posthumous edition was accurate to his intention. So homeopaths from all over the world come and visit that book, which in my mind was huge, but it’s this tiny little book.
It had international value. People didn’t have to come to San Francisco to visit the book anymore. We had a homeopath visit from Germany who said she had downloaded all of the notes and printed them out, but she still wanted to come and see the physical object.
Plus, it was an interesting challenge because the notes were handwritten, and degrading, and taped in, and the tape was failing; sometimes the notes were folded and two-sided. So it was an interesting challenge for me to figure out how to digitize those pages and disseminate them. Plus, we didn’t really want a lot of people touching this book because it was becoming pretty fragile.
So I think for archives, the decision is who’s the community that wants access, what’s the demand for that, how fragile is the object? Is there a benefit in digitizing it for the digital record as well as the physical object? So it was a little bit about building the community, making things accessible that might be fragile, and also just highlighting things in your collection.
How did you first find out about DAM systems as a way to deal with digital assets?
Initially, I looked at an early system when I was still in New York in 1992, which I think was called Digital Arts and Sciences. And that was a small room filled with DAT (digital audio tape) decks and kind of what seems like a really archaic computer now.
Then, I worked at a publishing house, Benjamin Cummings Publishing, and we had a digital asset manager there, and that was the first time I’d seen a full deployment. That digital asset manager has been a great mentor to me over the years in terms of understanding process, tools, how people are accessing, and what they need to access. So, I’d say in about 1995 was the first time I started using a digital asset management tool.
I see you obtained an MLIS degree. How do you use your library science skills in your role as a digital asset manager?
I went to library school after already having been a DAM manager for a while because I felt like I had some knowledge gaps and I really wanted to understand more traditional archival processing. I don’t really feel DAM is as library-related as it is archive-related in terms of being able to organize, describe, and disseminate. I think there’s more close alignment with archival processing. So I wanted to gain some experience there, which was really helpful. Probably the most valuable thing I got out of going back to grad school was really gaining an understanding of preservation. And that’s one thing we don’t talk about enough in the DAM community.
We talk a lot about archiving, but archiving in DAM means it just goes to some long-term storage; it’s not active preservation. There’s no guarantee that those files are gonna be readable in 10, 15, 50 years, because they’re just sitting on another file store. And so there’s a lot of work being done in terms of how do we do active preservation, what does that mean, are we creating emulators, are we migrating content, updating files as operating systems and tools that we use to read files like Photoshop are being upgraded, how do we make sure that those files are still readable? So that’s something that I’m really communicating out to the DAM community.
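One concrete piece of active preservation is periodic fixity checking: storing a checksum for each asset when it enters long-term storage and re-verifying it on a schedule, so silent corruption is caught while a good copy still exists. As a minimal sketch (not tied to any particular DAM product; the manifest format and paths are illustrative):

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream a file through SHA-256 so large assets never load fully into memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_fixity(manifest: dict[str, str], root: Path) -> list[str]:
    """Compare stored checksums against the current files; return paths that drifted."""
    failures = []
    for rel_path, expected in manifest.items():
        target = root / rel_path
        if not target.exists() or sha256_of(target) != expected:
            failures.append(rel_path)
    return failures
```

In practice a tool like this runs on a schedule against the archive store, and any path it reports triggers a restore from a redundant copy.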
My MLIS degree really helped me expand my knowledge of metadata because I already knew Dublin Core, but there are so many other metadata schemas out there. Are they better, or how do you mix and match, and how are those metadata schemas used? That was really helpful to me, and that was sort of my goal in going back and getting my MLIS. And that’s been really useful to understanding those XML schemas and how they can be used, and how they can be used for the long-term preservation of your assets so that you get more return on your investment.
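To make the schema point concrete: Dublin Core defines a small set of descriptive elements (title, creator, format, and so on) under the namespace `http://purl.org/dc/elements/1.1/`, which makes records interchangeable between systems. A minimal sketch of wrapping a flat asset record in Dublin Core XML — the record values here are invented for the example:

```python
import xml.etree.ElementTree as ET

# Real Dublin Core element-set namespace; the record below is illustrative.
DC_NS = "http://purl.org/dc/elements/1.1/"
ET.register_namespace("dc", DC_NS)

def record_to_dublin_core(record: dict[str, str]) -> ET.Element:
    """Wrap a flat asset record in Dublin Core elements (dc:title, dc:creator, ...)."""
    root = ET.Element("metadata")
    for field, value in record.items():
        el = ET.SubElement(root, f"{{{DC_NS}}}{field}")
        el.text = value
    return root

record = {
    "title": "Hahnemann's Organon, author's copy",
    "creator": "Samuel Hahnemann",
    "format": "image/tiff",
}
xml = ET.tostring(record_to_dublin_core(record), encoding="unicode")
```

Mixing and matching schemas then becomes a mapping exercise: the same flat record can be emitted as Dublin Core for exchange while richer, domain-specific fields live in another schema alongside it.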
I see you’ve worked at a few different places where you implemented a DAM system. What were some of the common challenges during implementation?
Often there’s not adequate IT support, and sometimes IT shops will make the decision about the tool based on what they think they can support, not based on what is best for the organization. In every place I’ve worked, it’s come down to, do I have IT support or do I not have IT support? Can I run something on-premise or can I run it in the cloud and take my IT team out of it?
A lot of IT teams are challenged with supporting a lot of different tools, and DAM is very specialized. So my preference is to just ask my IT group to support APIs and not try to also support the tool. Leave that support up to the vendor who has deeper expertise, because once you get a DAM adoption in place in the organization, people want it up 24/7. They’re very demanding. They’re not very patient, and if you then have to wait for two hours of time from an IT person who may only sort of know your system, it’s not very efficient and it’s not very cost effective in the end.
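The "IT supports APIs, vendor supports the tool" split usually means IT owns only a thin integration layer over the vendor's REST API. As a minimal sketch — the base URL, endpoint path, and bearer-token scheme here are hypothetical, not any vendor's real API:

```python
import urllib.parse
import urllib.request

def build_search_request(base_url: str, query: str, token: str) -> urllib.request.Request:
    """Construct an authenticated asset-search request for a hypothetical DAM REST API.

    The /api/v1/assets/search path and auth scheme are illustrative; a real
    integration would follow the vendor's published API documentation.
    """
    params = urllib.parse.urlencode({"q": query, "limit": 25})
    url = f"{base_url}/api/v1/assets/search?{params}"
    return urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
```

Code like this is all the in-house IT team has to maintain; uptime, upgrades, and tool internals stay with the vendor.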
So it comes down to creating the right relationship with your IT team and your vendor, and that’s been a commonality. I’ve heard that from a number of colleagues, too. When the IT team gets too involved, invariably either it doesn’t get deployed right or the wrong tool is chosen because they make choices not based on the requirements across the organization, but based on what they can support.
What does the process of convincing an organization that they need a DAM system look like? What’s the journey?
Sure. So I think the journey is identifying what those pain points are across the organization, and most of us have the same set of pain points, so that’s not difficult. There’s a lot of information on Widen’s website or any number of other websites to demonstrate how to prove the return on investment.
“I think the cost savings efficiency factor is a huge sell. There’s a lot of time savings that can be realized once you identify that you’re getting hit up for a lot of the same requests and it’s incredibly time consuming and taking you away from doing new work. That is the biggest sell, I think, on the corporate side, just that efficiency factor.”
Then you quickly prove it through stakeholder comments: this tool is great, it saves so much time, we didn’t have to go talk to people, I can do a simple search and find what I was looking for.
The cost, at least the initial startup cost, usually is justified. Sometimes the ongoing cost for the annual licenses becomes a little harder to keep selling, so you have to get in the mindset that it’s not a one-time sell, it’s a constant sell. And I think people don’t realize that. I think from a budget standpoint, higher up the food chain they always see it as a one-time cost with long-term value, but it’s never a one-time cost. And that becomes a little bit of a harder sell that this is a long-term investment. You’re going to have to constantly provide the analytics to demonstrate that it is very useful across the organization.
What gets the attention of the highest ranking person at an organization with digital asset management?
I’m laughing, because the first thing that popped in my mind is if they could find images for their slide deck without calling anybody. That seems to demonstrate a lot of value to people higher up the food chain because they’re not really accessing the assets on a regular basis until they do a presentation, right? So how quickly can you get assets together for my presentation?
Well, if I’ve got them in the DAM, then it’s minutes, and then you can go look in the DAM too, and they’re like, ooh, this is cool. It’s usually not until they need something that they find value, and even though that’s sort of a lower-end use case, it’s where the people higher up the food chain are gonna be interacting with the tool. It’s still valuable to them. It’s not where I think the most value is, but it’s still valuable.
What have been some of the lessons learned that stand out that you want others to look out for?
Planning. I think it is really important to do a broad requirements gathering. There’s lots of benefits to doing that. You start to get stakeholder buy-in while you’re understanding how people are gonna use your tool. All of that information informs how you’re going to set up your metadata, how you set up your taxonomy, how you’re going to do training, and how you’re going to do the rollout. And that’s really valuable information.
If you go through those steps, it takes a little bit of time, but the value then becomes much greater and it makes the whole deployment much smoother. Then you’ve got users who are excited about it and ready to use it as soon as it’s ready. And that’s the whole point in having it, right? To get people to use it.