After I first spoke with Caroline some months ago about the marvellous work she was doing with Irish Deaf Kids (now Sound Advice), my thoughts turned to how technology could assist, and to how Irish Tech News could help get that information out there.

It may not be an obvious business sector, but we hope that with the many application, hardware and web developers who visit our site regularly, the mutual benefits for both may become clearer after reading this article.

We start off with an introduction to Caroline’s early life, which we hope will open your eyes to the obstacles she faced.

Later we explore the impact current technologies are having and, hopefully, what we can expect in the future.

Caroline’s early life:

Caroline Carswell is profoundly deaf, verbal and does not use sign language. Her deafness was confirmed at age 16 months, in the 1970s, with no family history of hearing difficulties.

They researched courses of action, finding a real lack of support in Ireland. With information from the UK and the US, however, they discovered it was possible for a deaf infant to learn to listen and to speak through consistent hearing-device wearing, plus daily speech therapy.

Bravely, Caroline’s parents chose the listening-and-speaking communication option, with the goal of enrolling Caroline in mainstream school, supported by daily speech therapy at preschool and primary level.

By 18 months old, Caroline wore hearing-devices and her learning-to-talk process began with home-teaching from her family, and daily classes with the speech therapist.

Technology at this stage included an oscilloscope whose display showed speech sounds made by the therapist. Caroline’s task was to ‘match’ the therapist’s patterns on the oscilloscope’s panel by vocalising sound pitches with information received via a mix of lip-reading and hearing with her devices.

Three other deaf pupils attended the same mainstream primary school in their own age-appropriate classes with hearing peers. All went to the same speech teacher in Harcourt Street.

Caroline finished primary school with the support of one hour’s speech class every weekday and a one-hour resource teacher session per week. The daily speech classes ceased at this time and at age eleven, Caroline found that behind-the-ear (BTE) hearing aids gave her access to some birdsong and to new sounds.

At the local high school, her weekly teaching supports were two hours’ extra maths, and a one-hour resource session. Extra-curricular activities as a teen included sports participation at provincial level (hockey, badminton and swimming), plus weekend & school holiday work at a local health-food store.

Caroline took the Leaving Certificate on the same basis as her hearing peers, with the goal of reading History at Trinity College, Dublin, or taking a degree in the UK. One of the first born-deaf verbal children in Ireland to be mainstream-educated in the 1970s, she matriculated for Trinity College Dublin (before an access office existed), and graduated with an honours degree.


Q1 Hi Caroline. Thanks for taking the time to do this with us. Can you tell me when you came up with the idea to start Irish Deaf Kids?

In Ireland, verbal deaf people are not seen on TV or in the media – which increases the public’s fear about meeting people like ourselves (my peer group). The catalyst for setting up IDK in 2007 was our sheer frustration at explaining ourselves as verbal deaf people to almost every new person we met at college, in mainstream social groups, when travelling, and in new training environments and workplaces – as deaf people with speech, we did not fit the perceptual box that was created for us.

Another question concerned us. If the public did not know deaf people can be (and are!) verbal, were families in Ireland with newly diagnosed children being informed of all their communication and education options? With access to hearing and speech services being a human rights issue, this point was critical to Team IDK. Hence, our goal was to mainstream hearing issues and to make the challenges relevant to social inclusion practices.

Q2 After you completed your degree in Trinity, in what area did you pursue a career?

During my third year at Trinity, I got into Mac-based DTP work with Trinity News, the college newspaper, as a sub-editor and occasional assistant editor, in addition to part-time paid work at the Students’ Union. Accordingly, TCD’s careers guide suggested publishing, via a one-year postgraduate diploma course at Oxford Brookes University (UK). From that course, I gained a permanent full-time role at Oxford University Press (OUP), after a year of juggling three part-time jobs within OUP’s Science and Journals division.

Two years later, I asked my boss “where is this job going?” and he did not answer. Instead of spending two or three more years in the UK, I returned to Ireland after five years there and researched postgraduate courses in Internet systems. The day after I moved home, an advert for a new ESF-funded DIT course in Internet systems appeared in the Times. Six months earlier, I had interviewed unsuccessfully for a multimedia course at DIT, but with that application on record, DIT offered me a place on the Internet systems course. That course led to a corporate marketing role with Trintech in Dublin, managing two corporate resource and product marketing websites.

After these two roles, spanning a 15-year period, I had the skills to be confident in running a website of my own (the site as it then was) and curating the content it would need on a recurring basis.

Q3 As a deaf person, what difficulties did you experience in your professional career then and how do you feel it has improved nowadays?

Only after finishing the postgraduate course at Oxford Brookes did I realise how difficult securing my first role would be. Publishing is difficult to break into and I had few contacts in the UK – while the recruitment agencies in Oxford made it clear they did not want to deal with a client who was deaf, even one with a postgraduate diploma. Sound Advice’s YouTube channel has a video telling this story. Thankfully, a savvy recruiter overheard one particular (discriminatory) exchange in a high-street agency and invited me onto her books. Through her, I got the permanent full-time role with OUP mentioned above.

Second time round, my curating and writing skills got me hired. In fact, the word power Trintech needed derived from my time in speech therapy, which focused heavily on reading books aloud for learning and pronunciation. This method gave most of my peers excellent literacy skills, which they leverage in their current careers in the public service, technology, finance, marketing and other areas – and which are massively useful in the different roles that running Sound Advice requires.

As for today’s recruiting processes: it is notable that in North America, recruiters and contacts approach me to discuss job opportunities, but in Ireland such conversations are much more difficult to start, potentially because people with disabilities were traditionally segregated from mainstream society. Diversity can bring new talent to businesses facing skills shortages, particularly if flexible work policies are offered to people who need to take medication or regular breaks to pace their energy through the working day. In recent years I have not actively participated in the recruitment market, but this will change as I move into short-term marketing contracts and project collaborations.

On the upside, tech is a big leveller in workplaces and this is one reason IDK rebranded as Sound Advice – for the varied stakeholders to realise the value of the lived experience available from this sole trader.

Q4 You now act as an advisor to government and other state bodies. Can you tell me about what that work involves?

Through Sound Advice (IDK), many hats get worn (founder/financier, business development, marketing, sales, finance and team leader). Requests to give input to seminars and conferences followed, and the advisory roles grew from there. My lived experience as a deaf person who navigated mainstream education and workplaces before social inclusion and access policies existed is most often sought, as a panelist at events and/or as a contributor to policy formulation in these areas. Occasionally I am contacted to give observations on potential new policy developments in access and/or social inclusion plans, or to advise client contacts on optimising their internal processes to improve services for customers and employees.

Having a multidisciplinary background is a critical part of this work, to see different aspects of a business or operational challenge and to be able to recommend solutions to those particular challenges.

Q5 Irish Deaf Kids has recently evolved into “Sound Advice” and your website has been redesigned over the summer. Can you tell our readers about that transformation and also about what your day to day role at Sound Advice now entails?

Seven years after its establishment, the team at IDK believed the venture had met its charitable mission of supporting inclusive education in Ireland while empowering parents to develop their child’s full potential. In that time, newborn hearing tests were rolled out in Ireland (2011-13), the education policy paper for mainstream education was published (2012), and since July 2014, eligible infants and children have been able to access HSE-funded bilateral cochlear implants.

In this redefined operating landscape, stakeholder input was needed. After some discussions, team IDK chose to upgrade and rebrand the website in one move, so content could be accessed from both mobile and desktop devices. This way, the best part of IDK (its website) was retained, while refocusing the site towards technology for inclusion in mainstream education and living.

My role with Sound Advice alters in that the parents are now connected, empowered, and working as activists in the state health and education services systems, with input from here, as needed. Accordingly I am free to explore and to guide on technology use as a leveller for children and students with hearing difficulties. To do this, I connect with employers, tech firms, policy makers and recruiters, many of whom already track IDK-Sound Advice’s online community pages to learn about potential to optimise their products, services and policies for people with hearing issues.

Q6 One of the areas we wanted to concentrate on in this interview was how technology can assist people with hearing difficulties and act as a leveller. Since your early experiences with hearing aids in the 1980s, how has the technology advanced, and what are the key features of modern hearing aids that are really beneficial?

Digital hearing-devices are now a slick consumer technology that pairs with smartphones via Bluetooth and wifi, whereas the 1980s devices were basic and beige, with far less sound clarity. Through my teen and young adult years, my access to sounds such as birdsong increased every five years with routine upgrades of hearing-aids. The first digital pair, in 2004, brought accents to my ears for the first time and required me to rejig my lipreading to integrate the new audible detail. In fact, this was a great start to learning to lipread with the sound clarity of a cochlear implant from 2011.

Like mobile phones, hearing-devices are miniaturising. The smartphone-managed, discreet Halo ‘hearable’ from Starkey has storable GPS settings to auto-cut background noise in loud places the wearer regularly frequents, such as cafes and bars. Phone, Skype and Facetime calls can be streamed to the Halo device, as can Siri responses and music. By 2018, ‘hearables’ (smart earbuds) like this are forecast to be a USD 5 billion market, and being able to discreetly alter hearing-device settings with a phone is a boon, provided smartphones are not banned in public places. One time in a theatre in the US, a woman sitting behind me sharply tapped my shoulder when I changed my implant’s volume with its standalone back-lit remote control. She thought it was a phone, but a smartwatch is the real solution for this challenge – and for many more uses when hearing devices connect with portable, smart devices.
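The GPS-driven auto-switching described for the Halo boils down to simple geofencing: check the phone's current position against saved venues, and apply the matching sound profile. Here is a minimal, hypothetical Python sketch of the idea – the venue coordinates, radii and profile names are illustrative assumptions, not Starkey's actual implementation:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Distance in metres between two latitude/longitude points."""
    r = 6_371_000  # Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Saved venues: (lat, lon, radius_m, profile) -- illustrative values only
SAVED_VENUES = [
    (53.3438, -6.2546, 50, "noisy-cafe"),    # cut background noise here
    (53.3498, -6.2603, 80, "quiet-office"),  # normal settings here
]

def pick_profile(lat, lon, default="everyday"):
    """Return the profile for the nearest saved venue we are inside of."""
    best = (float("inf"), default)
    for vlat, vlon, radius, profile in SAVED_VENUES:
        d = haversine_m(lat, lon, vlat, vlon)
        if d <= radius and d < best[0]:
            best = (d, profile)
    return best[1]
```

A phone app would run this check as the location changes and push the chosen profile to the hearing device over Bluetooth.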


Q7 With the prevalence of smartphones and tablets these days, having easy access to a visual display must be a great aid?

Visual detail is everything for people of all ages who wear hearing-devices, with smartphone and tablet screens offering a vital visual interface to updates distributed via social media and other channels – a real alternative to TV or radio – and to messages from friends and contacts. News-hounds like Twitter’s near-real-time feed, which displaces broadcast media and allows rapid sharing of images via Facebook and Pinterest, too. Cinema fans with smartphones have the Cinema Subtitles Viewer app (Android/iOS), which customises subtitles on a user’s phone for personal reading. Smart-ear apps like Otosense and Braci are also emerging; these use smartphones to identify and label sounds in a user’s environment and send visual alerts about nearby sounds to the smartphone screen.
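The smart-ear concept behind apps like Otosense and Braci can be sketched in a few lines: classify an ambient sound, then turn important labels into visual alerts. This toy Python example assumes the audio classification has already happened; the labels, priorities and alert format are illustrative assumptions, not taken from any real app:

```python
# Sounds worth alerting on, with an assumed priority for each.
ALERT_SOUNDS = {
    "smoke alarm": "URGENT",
    "doorbell": "INFO",
    "dog barking": "INFO",
}

def handle_detected_sound(label, alerts_log):
    """If the classified sound matters, append a visual alert to the log."""
    priority = ALERT_SOUNDS.get(label)
    if priority is None:
        return None  # background noise: ignore
    message = f"[{priority}] Detected nearby: {label}"
    alerts_log.append(message)
    return message

log = []
handle_detected_sound("doorbell", log)
handle_detected_sound("rain", log)        # unknown label, no alert raised
handle_detected_sound("smoke alarm", log)
```

In a real app, the log entries would surface as on-screen notifications, with the phone's microphone feeding a trained classifier rather than hand-labelled strings.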

Children in school and students in college similarly use smartphones and tablets to access visual detail relevant to their learning. This may be (for under-fives) learning to read with images and text, to acquire an understanding of new words and connections, and expand their knowledge of the world around them. Older children use mobile devices to read realtime classroom captions when these are provided (as in Australia) and to store notes transcribed as a summary from classes, by an in-person note-taker. Learning becomes visual, as in a 3D picture of the solar system, or acquiring and understanding a modern foreign language with the benefit of captions for video and/or for classroom dialogue (when these are provided).

Educators using digital (visual) tools are required by emerging international laws to customise digital material for access and use by students with English as an additional language, by students with hearing issues, and for general teaching use. Significantly, YouTube’s auto-captions tool is widely criticised, while 80% of people who use video captions have no hearing issues – a tech shortcoming that is driving demand for properly embedded captions in video. Current market leaders in this area include Panopto, Kaltura, 3Play and Amara Subtitles, with podcast-transcribing tools including ListNote and Dragon Dictate. The output of this text-based content is visual, and most students already have a mobile device in a pocket or bag.

Teens redirect messaging apps like Skype, WhatsApp, Snapchat, Viber, Kik, MeowChat, Yo, Loop and Whisper/Secret for rapid visual contact and picture exchange with text among their friends. In the fast-moving world today’s teens inhabit, messaging apps can be a critical way for a deaf teen to follow the buzz and track the daily dramas in their friends’ lives (if these are shared). Unlike previous years, deaf teens are no longer on the back foot in terms of hearing gossip or news about their peers, but on the flipside, the technology can also be used to undermine individuals, as we are regularly reminded.

Q8 The most obvious leap forward for me has been in voice recognition. It has become so good now I use it everyday. With every update of Android and iOS it improves even further. Can you tell me about the impact this has had for you and what are the best services that are utilising this technology now?

Voice recognition, for me, means real-time captions and the transcribing of podcast content into text form for initial access and review. Everyone uses voice recognition differently (as we are noting), but real-time captions in education and training environments are the core value for students and employees needing to constantly upskill in today’s competitive jobs market. Realtime caption service providers praise Dragon Dictate, Siri and Google Now as routine tools, but a slew of new solutions for education and training is constantly emerging, such as VerbaVoice, devised by a social enterprise in Germany.

Realtime captioning is a lifeline for severely to profoundly deaf people, particularly for sessions longer than 2 hours when the energy required to hear and/or lipread spoken dialogue in an ad hoc environment can quickly drain personal energy. In fact people with hearing difficulties use up to 50% of their daily energy to communicate, versus just 5% for hearing people.

Transcribing tools are a holy grail for people who don’t hear speech sounds clearly at meetings or when learning or upskilling, or who want to get their own spoken notes into text (regardless of hearing ability). Options for transcribing include personal note-taking tools like Dragon Dictate, VoiceNote, DictaNote or Online Dictation, while people who are non-verbal have options like Announcify, Select and Speak, or Speak It.

Q9 Many well known App developers are regular frequenters of our site. For any of them reading this now, what areas do you feel they could be working on that would have a real impact on the quality of life for deaf people?

Speech recognition, real-time voice-to-text and voice-transcribing tools are the massive, untapped market, currently valued in the US at USD 16 billion per year, with a predicted 21% compound annual growth rate in the educational and corporate sectors. Cloud computing will drive this market’s maturity; one new app is “I See What You Say”. One Lyft driver in San Francisco, the founder of the Digital Army Devices start-up and himself deaf and verbal, asks his passengers to speak into his Android smartphone so he can read the output on a smartwatch on his wrist. This app was devised to address his and others’ pain point of needing to lip-read while multi-tasking (driving, in this case).

Think “life with subtitles” when considering the potential to redirect transcribing tools for executives and students working in a language beyond their native tongue, in meetings and when learning. In a recent meeting, one attendee cited Google Glass in the context of its transcribing from English into his native language, and everyone nodded in empathy. Tools like this, with mass-market user potential, are a very practical and potentially lucrative area on which to focus research and development (R&D) budgets.

Founders seeking a big opportunity should also consider tools for redirecting voice-mails into text format. Unified communication platforms have potential, but deaf people can need an actual person to retrieve voice-mails left on their phones, even when their auto-response asks callers to send an SMS text or email instead of leaving a voicemail. Such tools would also benefit executives with full hearing who need to discreetly retrieve voice-mails as a meeting starts or concludes. Get in touch with Sound Advice if you have a relevant voicemail tool, or one to redirect – this issue is a particular irritant for the team here.
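The voicemail-to-text pipeline suggested here is straightforward to outline: fetch each new voicemail, run it through a speech-to-text engine, and deliver the result as a text message or email. In this hypothetical Python sketch, transcribe() is a stub standing in for a real engine (e.g. a cloud speech API), and delivery is just a callback – every name here is an assumption for illustration:

```python
def transcribe(audio_bytes):
    """Stub: a real implementation would call a speech-to-text service.

    For this sketch we pretend the audio bytes already contain the text.
    """
    return audio_bytes.decode("utf-8")

def voicemail_to_text(caller, audio_bytes, deliver):
    """Transcribe one voicemail and hand the text to a delivery callback."""
    text = transcribe(audio_bytes)
    message = f"Voicemail from {caller}: {text}"
    deliver(message)  # e.g. send as SMS or email in a real system
    return message

outbox = []
voicemail_to_text("+353 1 234 5678", b"Meeting moved to 3pm", outbox.append)
```

The point of the callback design is that the same pipeline could deliver to SMS, email, or a push notification without changing the transcription step.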

Q10 Let’s talk about the “Internet of Things”. Having completely interconnected devices across the whole home, I imagine, will also offer many benefits. How do you see this developing and what are the key areas that you think can make a big impact for people with hearing problems?

Having interconnected devices in the home is compelling for people with hearing issues. At the CES 2014 show in Las Vegas, devices including a TV, speakers, a smoke alarm, light bulbs, and a door lock were linked using an open protocol. When the smoke alarm detected smoke, the speakers emitted its signal. In turn, the light bulbs flashed a specific colour to show which home-alarm was active. Meantime, the TV (or screen) in each room displayed evacuation instructions, and the connected door unlocked automatically.

At component level, smartphone-linked light bulbs like those from LG can be wifi- and bluetooth-linked to alert deaf people to sounds and/or alarms that may be active nearby. Pairing these light bulbs with text message and email alerts – and with doorbells, smoke/carbon monoxide, burglar and other alarms – will be vital. Maybe LG could team up with the makers of the Deafalarm app to pool their synergies? Similarly, if a night-time attempt were made to steal a car from the driveway, or to break into a connected home, tactile alarms could alert the homeowner to the clandestine activity on their property, perhaps with connections to the local police or civil guard in urban or suburban areas. Either way, we can anticipate out-of-the-box team-ups as new solutions are devised to address scenarios we don’t even think of today.

Cochlear Implants


Q11 In 2011 you had cochlear implants fitted. Can you talk us through that experience and what that has meant for you?

Adults in Ireland receive one cochlear implant on the HSE, but perhaps this provision will change to two once the social and cost benefits of the new two-implant programme for children are documented. On reading about cochlear implants in 1985, my reaction was, “I’ll do that when I’m grown up!”. Fast forward 20 years and two sets of digital hearing-aids, and a hearing test showed my left ear had lost some sound perception. “Have you ever considered a cochlear implant?” the audiologist asked. For peace of mind, I needed to research this, my only option for rejoining the world of partial sound. How long were the surgery and recovery times? Could I still travel and live abroad? And what about mountain sports and activities?

Surgery duration for cochlear implants dropped from 7 hours in 2004 to 4 hours in 2008, and by 2011 was around 90 minutes for one ear. That settled it for me. One final confirmation was seeing an audiogram online from a contact with similar pre-surgery hearing levels to mine. Her post-surgery hearing tests showed her to be hard of hearing with her implant, an outcome beyond anyone’s predictions. As a person who likes calculated risks, this brought me closest to consenting to the surgery. Two weeks before surgery, however, I got cold feet. What if the device did not work, taking my tiny scrap of residual hearing with it? (After this surgery, little natural hearing remains in the ear.) This point bothered me, but as my family pointed out, the potential benefits were far, far greater than the scary risks that were surfacing. My fears faded slightly when the surgeon said, in his evening post-surgery visit, that all 22 electrodes fired when tested during surgery – a positive sign for when the device was activated a month later.

Switch-on day was underwhelming. Instead of hearing sound as I knew it, everything was beeps, whines and whistles as my brain struggled to decipher this new detail. Walking around the house labelling all the sounds (the fridge, clocks ticking, and later the dog’s tag chinking) helped make sense of this new world. About three weeks after switch-on in 2011, I overheard a full sentence without lipreading, to the family’s amazement – this was unthinkable before surgery. With daily listening practice, I can now overhear bits of conversation in the supermarket, in the street, at the post office and at office reception desks. This summer I even overheard a verbal update from a buddy in a noisy beer-garden, which surprised us all.

Cochlear implants aren’t a fix, but for people with profound deafness like mine, they can be the only option for accessing everyday sounds and feeling more connected to the people central to your life. Children who get early implants have a head start in learning verbal language, particularly if they’re from chatty families and get soaked in spoken words – which research shows is best for essential early-years learning. The irony is that when my entrepreneurial route began in 2007, I had no idea whatsoever that a cochlear implant would be acquired along the way, but in fact it is a massive asset as the next stage of the road unfolds.
