Natural Language Processing (NLP) Models for Evaluating Success Management in Customer Engagements

Generated with DALL-E 3. “NLP and Customer Success Metrics”

Manufacturing businesses rely on business software, and that software is often intricate and demanding. Customer success management focuses on ensuring that customers effectively adopt the software’s business capabilities and are retained as revenue sources. Activities related to customer success, such as monitoring, evaluating, driving adoption, teaching, and expanding the software’s use, are becoming increasingly important. With a growing array of software options and the ease of switching vendors via cloud-based applications, customer success may be the key factor in guiding customers through a successful software journey.

Research underscores how customer success drives value, with some studies highlighting that value generation often stems from expanding relationships with existing customers. For instance, a 2023 McKinsey & Company article revealed that 80% of total value creation by organizations comes from existing customer expansion, a core principle of customer success (Bough et al., 2023). The research commonly identifies metrics such as expansion and retention as critical indicators of customer success. However, this article explores how Natural Language Processing (NLP) models can offer additional insights into customer success performance, potentially enhancing business outcomes and value delivery through software.

NLP provides a sophisticated, data-driven approach to evaluating customer success. Organizations must examine how routine customer success management, i.e., leveraging business processes, system-use telemetry, and performance metrics, can be enhanced with NLP. Traditional customer success KPIs like “retention rate,” “expansion dollars,” or “CSAT scores” may not fully capture the effectiveness of customer success management. By integrating these KPIs with NLP-driven analysis of interactions, technical proficiency, and system-use telemetry, a more comprehensive view of customer success can emerge.

Traditional customer success metrics, such as Net Promoter Score (NPS), Annual Recurring Revenue (ARR), Net Dollar Retention (NDR), and churn, often fall short of providing a complete picture. True success involves delving deeper into customer conversations, notes, and transcripts. NLP tools like topic modeling surface the frequency of words and ideas in conversations, while sentiment analysis reveals the tone and balance of recorded calls or transcripts. Together, these NLP measures augment customer success metrics and can transform how organizations evaluate and achieve meaningful customer outcomes.
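
As a concrete illustration, the sketch below pairs scikit-learn topic modeling with NLTK’s VADER sentiment scorer. It is a minimal sketch, not a production pipeline, and the transcripts are invented placeholders for real engagement data.

```python
# Minimal sketch: topic modeling + sentiment analysis over customer
# success transcripts. The transcripts are hypothetical stand-ins.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from nltk.sentiment import SentimentIntensityAnalyzer
import nltk

nltk.download("vader_lexicon", quiet=True)  # lexicon used by VADER

transcripts = [
    "The onboarding went smoothly and the team loves the reporting module.",
    "We keep hitting errors in the invoicing screen and support is slow.",
    "Training was helpful, but we still struggle with the scheduling feature.",
]

# Topic modeling: surface the recurring words and ideas in conversations.
vectorizer = CountVectorizer(stop_words="english")
doc_term = vectorizer.fit_transform(transcripts)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(doc_term)
terms = vectorizer.get_feature_names_out()
for i, topic in enumerate(lda.components_):
    print(f"Topic {i}:", [terms[j] for j in topic.argsort()[-4:]])

# Sentiment analysis: gauge the tone of each recorded interaction.
sia = SentimentIntensityAnalyzer()
for text in transcripts:
    print(round(sia.polarity_scores(text)["compound"], 3), "-", text[:45])
```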

The concept of incorporating NLP and machine learning into mixed model analysis is not new. However, recent advancements in methodology and functionality for extracting significant insights from diverse data sources (Li et al., 2024) are noteworthy. NLP technologies are moving blazingly fast. Advanced Large Language Models (LLMs) like ChatGPT, Copilot, BERT, and LLaMA are based on the Transformer architecture. These models not only contextualize natural language but can also interpret instructions and respond to prompts. Use these models as tools for combining your customer success engagement data with the CS metrics.
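
For example, here is a minimal sketch of putting a Transformer-based model to work on engagement notes, assuming the Hugging Face transformers package; the notes are invented, and the pipeline’s default model is used for brevity rather than as a recommendation.

```python
# Minimal sketch: score engagement notes with a Transformer model.
# Assumes the Hugging Face transformers package; notes are invented.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # downloads a default model

notes = [
    "Customer renewed early and asked about the expansion modules.",
    "Escalation call: customer frustrated with repeated outages.",
]
for note, result in zip(notes, classifier(notes)):
    print(result["label"], round(result["score"], 3), "-", note)
```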

Current methods of leveraging NLP in customer engagements include:

  • Chatbots and Virtual Assistants: Companies like Amazon, Starbucks, and Netflix use NLP-powered chatbots to assist with inquiries and support, providing timely and accurate responses.
  • Sentiment Analysis: NLP tools analyze customer feedback to gauge sentiment. For example, Delta Air Lines used text analytics to identify and address issues in their in-flight services, leading to improved customer satisfaction. NOTE: Staircase.ai is a pioneer in this space and could be considered for customer success organizations.
  • Automated Response Systems: NLP-driven systems manage customer inquiries efficiently, with banks using chatbots to enhance support processes.
  • Enhanced Service Quality: Integrating NLP AI chatbots into customer service frameworks can boost service quality and efficiency, though careful implementation is essential.
  • Improving Customer Experience (CX): NLP’s ability to understand and process natural language is key to enhancing CX across various industries.

Integrating NLP model-driven analysis with key customer success metrics is crucial. Success in customer engagements depends not only on quantitative measures but also on understanding interactions through conversations, notes, and transcripts. By incorporating NLP-driven insights, organizations can gain a deeper understanding of customer needs and sentiments, improving decision-making and fostering stronger customer relationships. This approach not only enhances customer satisfaction but also supports long-term growth and success.

Imagine a chatbot trained on extensive customer success data—conversations, notes, and transcripts. Such a chatbot would adapt its tone and responses to align with the company’s culture and vision, anticipating needs and providing personalized solutions. This level of customization would enhance the customer experience, reinforce the company’s commitment to success, and ultimately drive growth and loyalty. Leveraging AI in customer engagements goes beyond automation; it’s about personalization, adaptability, and aligning technology with the human touch.

Radio, We Still Love You. Not You, Internet Trolls

Do you own a Radio set?

1930 National Census

The 1930 census included for the first time a question regarding a consumer item. Respondents were asked whether they owned a “Radio set,” a luxury that had become increasingly common in the 1920s.

The radio was the first national mass medium. It linked every listener to news, sports, entertainment, comedians, political talk, and most of all, music. And of course it was free!

Not all nations celebrated free radio programming in the same manner as Americans. The 1930s also brought about a war on truth.

Joseph Goebbels ensured that radios would deliver the Nazi message to Germany. Goebbels delivered the Reich’s messages through the Ministry of Public Enlightenment and Propaganda. In his first speech to reporters about mass media, Minister Goebbels said “One can fire at the opponent with machine guns until he recognizes the superiority of those who have the machine guns. That is the simplest way. One can also transform a nation through a revolution of the spirit, not destroying the opponent, but winning him over… We have not become a minister to be above the people, but rather we are now more than ever the servant of the people.” Goebbels was using radio, television, film, and newspapers to first connect with the people, and second to transform and mold them to the propaganda of the Nazi party.

The German Volksempfaenger was used to indoctrinate the citizens.

Not long after taking the lead role in the Ministry, Goebbels and the Nazi party disbanded all of Germany’s private radio and television stations and took ownership of their equipment to deliver their ‘cleansed’ content. The ministry also sold and distributed a cheap, effective radio set called the Volksempfaenger for a mere 75 Reichsmarks. Nearly all German citizens could afford one, so nearly all German citizens were tuned in to its programming. By the time Chancellor Hitler delivered his Enabling Act speech on the floor of the Reichstag on 23 March 1933, granting him unlimited authoritarian power over Germany, millions of citizens across the country were tuned in to hear their Dictator take control.

Nuremberg Nazi rallies were held from 1933 – 1938.

Goebbels gave Hitler the country by connecting the people to him through radio. With the people behind him, Hitler steered the country into a decade of destruction. History knows the devastating consequences of Nazi Germany’s desire to rule the world.

Made for T.V.

American soldiers came home from the war. Families moved to the suburbs. Everyone bought a television. From the late 1940s through the 1960s, television programming entertained Americans from the comfort of their homes. Lucy led the way, followed by game shows, westerns, family comedies, crime dramas, live sports, soap operas, and the bane of all television watching for decades to come, commercials.

Television connected with people in ways that radio and telegrams never could: it was eye to eye. Viewers could see the faces, expressions, and body language of their presenters. The connections were visceral. People fell in love with the Beatles and Elvis because the rock and roll stars were playing for them, in their house!

The Beatles on the Ed Sullivan Show in 1964.

Television definitely connected people with celebrities, athletes, and game show hosts. It also revealed the horrors and atrocities of war. The Vietnam War was the first televised war. While World War II film footage was heavily edited to sustain morale, Vietnam coverage reached living rooms raw and largely uncensored.

Rolling CBS footage of the Vietnam War.

The war tore America apart. Anti-war protestors saw the images of dead soldiers and marched in the streets. War supporters saw the images and accepted the losses as the price of freedom.

The National Archives notes that the Vietnam War so deeply divided America that its fracture remains to this day.

Radio on the Tube

As television gained prominence in American households, it didn’t leave its audio roots behind. The transition from radio to television was seamless for some radio personalities, who found a new home on the small screen. The late-night talk show format, pioneered by radio legends like Jack Benny and Eddie Cantor, made its way to television with hosts like Johnny Carson and Dick Cavett. These shows became a staple of American television, offering a mix of comedy, interviews, and cultural commentary. The transition from radio to TV not only preserved the essence of these shows but also elevated them to a visual medium, creating a more immersive experience for the audience.

The influence of radio on television was not confined to late-night programming. Daytime TV also witnessed a transformation with the introduction of talk shows that were once radio staples. Shows like “Maury Povich” and “Jerry Springer” brought the sensationalism of radio talk shows to a visual medium, captivating audiences with dramatic confrontations and emotional revelations. This shift from radio to television demonstrated the adaptability of the medium and its ability to evolve while retaining the essence of its predecessors.

The Internet Talkshow Revolution

Fast forward to the digital age, and the talk show format has found a new home on the internet. The rise of blogs, social media, and online personalities has given individuals a platform to host their own talk shows, reaching audiences globally. Podcasts, YouTube channels, and live streams have become the modern equivalent of daytime and nighttime TV talk shows, offering a diverse range of content from interviews to discussions on current events.

Online personalities like Joe Rogan, with his immensely popular “Joe Rogan Experience” podcast, have become the new-age talk show hosts, connecting with audiences on a more personal level. The democratization of content creation on the internet has allowed for a multitude of voices and perspectives to be heard, reflecting the diversity of opinions in today’s globalized world.

In a more recent development, Pat McAfee, known as “The Punter,” has successfully transitioned from his career in American football to become a wildly popular internet talk show host. As a personal side note, I had the privilege of residing in Morgantown, WV during Pat’s time as a punter for the Mountaineers, witnessing his live performances on multiple occasions. Our paths even crossed at a house party hosted by my downstairs neighbors, where we shared a few beers.

The Pat McAfee Show, boasting over 2 million subscribers, is not only a testament to his success but also exemplifies the rapid growth of internet talk show influence. Notably, the show is simulcast on ESPN, underscoring its broad and expanding impact in the digital space.

Then came the social media accounts of Super Influencers: stars of field and stage who parlayed their careers into online personas that help everyday citizens feel connected to them.

Welcome Nikkie de Jager!

Nikkie de Jager, widely known as NikkieTutorials, is a Dutch makeup artist, beauty vlogger, and YouTube sensation who has left an indelible mark on the beauty and cosmetics industry. Born on March 2, 1994, in Wageningen, Netherlands, Nikkie gained international recognition for her exceptional makeup skills, creative artistry, and engaging online presence. Her journey into the beauty world began at a young age, and she honed her talents through self-expression and experimentation.

NikkieTutorials rose to prominence on YouTube, where she started her channel in 2008. Her makeup tutorials quickly garnered attention for their precision, uniqueness, and vibrant energy. Nikkie became particularly celebrated for her transformative power, showcasing the artistry of makeup and its ability to empower individuals to express their true selves. Her openness about personal experiences, combined with her skilled application techniques, resonated with millions of viewers globally, turning her into a beloved figure in the beauty community. Her 20M Instagram followers show her influence!

One of Nikkie’s career-defining moments occurred in 2020 when she courageously came out as transgender in a heartfelt video titled “I’m Coming Out.” The video not only demonstrated her authenticity but also inspired countless individuals struggling with their identities. Nikkie’s openness about her journey fostered a sense of connection and community, reinforcing her role as an influential advocate for self-acceptance and inclusivity in the beauty world.

Beyond her YouTube success, NikkieTutorials has collaborated with major beauty brands, launched her own makeup collaborations, and has been recognized with numerous awards for her contributions to the beauty industry. With her charismatic personality, remarkable skills, and commitment to authenticity, Nikkie de Jager continues to be a driving force in reshaping beauty standards and inspiring a diverse and inclusive beauty landscape. As of 2022, her YouTube channel has 22M followers!

Golazo Ronaldo!

https://www.instagram.com/cristiano/

Cristiano Ronaldo, also known as Ronaldo, CR7, and affectionately called “The GOAT” (Greatest of All Time), has not only dominated the football scene but has also become a social media sensation. With an impressive track record as the highest-paid athlete according to Forbes in 2016-17 and 2023, it’s no surprise that he holds the top spot in the social media realm.

In 2021, Ronaldo celebrated crossing the 500 million followers milestone across various social platforms, and today, that number has grown to an estimated 3/4 of a billion followers. Over the course of his illustrious career with football powerhouses Manchester United, Real Madrid, and Juventus, Ronaldo’s star has shone brightly for more than two decades.

While fans initially flocked to Ronaldo’s social media accounts to witness his remarkable technique and jaw-dropping goals, there has been a shift in recent times. Beyond his professional prowess, followers are increasingly interested in glimpses of Ronaldo’s personal life. This has become a focal point where CR7 attracts the most attention.

Though he boasts half a billion followers on Instagram alone, it was the 2017 nude photo shoot and associated pictures that garnered widespread attention beyond the usual football circles. Ronaldo, along with his football clubs, has capitalized on this broader appeal, turning non-football viewership into a lucrative source of income by raking in millions through non-sport-related advertising.

Shoot it Dr!

Meanwhile, in the realm of gaming and online entertainment, another figure captivating the digital audience is none other than Twitch superstar Dr. DisRespect, also known as Guy Beahm. With his iconic mullet wig, tactical gear, and unmistakable persona, Dr. DisRespect has carved out a unique space in the gaming community. Known for his energetic and entertaining live streams, he has amassed a massive following on Twitch.

Dr. DisRespect’s rise to fame accelerated as he brought a theatrical flair to his gaming sessions, combining skillful gameplay with humor and showmanship. His commitment to maintaining his character both on and off the screen has contributed to the creation of a brand that extends beyond the gaming world.

As one of the most popular streamers on Twitch, Dr. DisRespect has cultivated a dedicated fan base that eagerly anticipates his streams and engages with his content. Beyond the gaming sphere, he has ventured into various collaborations, sponsorships, and even explored creating original content.

Much like Cristiano Ronaldo, Dr. DisRespect exemplifies the expanding influence of digital personalities and the ability to captivate audiences across diverse interests. Both figures, each in their respective fields, showcase the evolving landscape where traditional and digital media converge to create a global phenomenon of entertainment and engagement.

The Dark Side of Connectivity

In the realm of influential personalities, each Super Influencer—Nikkie De Jager, Cristiano Ronaldo, and Dr. DisRespect—wields a unique impact on their respective audiences. While Ronaldo epitomizes a polished image on and off the football field, NikkieTutorials embraces authenticity in the beauty world, and Dr. DisRespect navigates the gaming sphere with a blend of dark humor and intense personality. However, the consequences of their influence extend beyond their individual realms.

The polished image that Ronaldo maintains has made him a global icon, setting standards of success and professionalism. Similarly, Nikkie’s journey as a transgender woman and her commitment to authenticity challenge traditional beauty norms, encouraging self-expression and acceptance. Meanwhile, Dr. DisRespect’s online persona, characterized by intense rants and dark humor, raises questions about the potential harm of such behavior, especially in the context of internet trolls.

As these Super Influencers shape perceptions and behaviors, viewers, gamers, and fans face the choice of either rejecting or embracing the influence. Ronaldo’s followers may strive for success and professionalism, while Nikkie’s audience might find empowerment in embracing their true selves. However, the intense and potentially harmful aspects of Dr. DisRespect’s persona raise concerns about whether some may be influenced to engage in absurd or violent actions. The consequences of embracing or rejecting these influential behaviors highlight the broader impact that Super Influencers have on shaping societal norms and individual choices.

In the vast landscape of cyberspace, internet trolls have emerged as a disruptive force, spreading hate, misinformation, and toxicity. Unlike the carefully curated and regulated content of traditional media, the internet allows individuals to anonymously unleash vitriol without accountability. This dark side of connectivity stands in stark contrast to the genuine connections fostered by radio, television, and early internet communities.

2016 – The Lost Internet

The Internet is lost to a culture of hate.

Once it was a geek with lofty ideals about the free flow of information. Now psychologists call this the online disinhibition effect, in which factors like anonymity, invisibility, a lack of authority and not communicating in real time strip away the mores society spent millennia building.

Joel Stein August 2016

In 2016, Joel Stein penned an insightful article for Time magazine, shedding light on the pervasive issue of internet trolls and the disturbing culture of hate that was taking hold on the internet. Stein’s observations provide a broader context for understanding the concerns raised about influencers like Dr. DisRespect and the potential impact of their online personas.

In his article, Stein delves into the disturbing trend of internet trolling, where individuals engage in online harassment and provocation to incite reactions from others. The culture of hate he describes reflects the darker side of online interactions, where anonymity often emboldens individuals to express extreme views or engage in harmful behaviors without facing real-world consequences. The article underscores the challenges posed by this phenomenon, affecting not only individuals but also the overall tone and atmosphere of online spaces.

When examining influencers like Dr. DisRespect, whose intense rants and dark humor have garnered significant attention, it becomes crucial to consider how their online presence may contribute to or counteract the prevailing culture of internet trolls. Stein’s exploration of the loss of the internet to hate culture adds depth to the discussion, prompting us to reflect on the responsibility influencers bear in shaping online environments and the potential consequences of perpetuating or challenging the status quo. As we navigate the influence of Super Influencers, understanding the broader context of internet culture, as explored by Stein, becomes essential in addressing the multifaceted impact of online personas.

Whitney Phillips, author of “This Is Why We Can’t Have Nice Things: Mapping the Relationship between Online Trolling and Mainstream Culture,” delves into the intricate history and culture of online trolling, offering valuable insights into its evolution and its complex relationship with mainstream culture. Her work, often referred to as “A Brief History of Trolls,” becomes a pivotal addition to the Internet Trolls section of this paper.

Phillips thoroughly explores the origins of online trolling, tracing its roots back to the early days of online forums. By doing so, she illuminates the deep integration of trolling behavior into internet culture, shedding light on its historical development and evolution. This historical context, as outlined by Phillips, serves as a valuable complement to Joel Stein’s observations regarding the emergence of a culture of hate on the internet.

As we examine the influence of Super Influencers like Dr. DisRespect and navigate the consequences of their online personas, Phillips’ work becomes integral in understanding the broader dynamics of trolling behavior. Her insights provide a nuanced perspective on how trolling has shaped and been shaped by societal dynamics, emphasizing the intricate relationship between online subcultures and the broader cultural landscape. Incorporating Phillips’ research enriches the discussion, offering a comprehensive understanding of the multifaceted nature of internet trolling and its implications for online culture.

From Hate to Lies

The early days of internet trolling were characterized by provocative and disruptive behavior within online communities. However, this behavior has transformed into a more sinister manifestation—contributing to the spread of online misinformation. Trolls, once known for stirring controversy for their amusement, have weaponized their tactics to manipulate information, creating a significant challenge for online discourse.

This evolution is particularly evident in the current landscape of social media platforms. Individuals with trolling tendencies exploit the algorithms and the rapid dissemination capabilities of these platforms to amplify misleading narratives. The line between online trolling and deliberate disinformation has become increasingly blurred, raising concerns about the erosion of trust and the potential real-world consequences of misinformation.

To what extent can provocative online behavior contribute to the spread of misinformation, and how does this impact the broader online culture?

Since 2016, misinformation campaigns have driven a deepening divide between Americans. Take the 2016 Presidential election cycle, when a deceitful conspiracy theory took hold and consequently influenced voters. The Pizzagate conspiracy theory went viral on social media. Farmed by trolls and spread like wildfire, it alleged human trafficking and pedophilia. Without context or supporting evidence, readers latched on to the sensational un-news and redistributed the links and messages. https://en.wikipedia.org/wiki/Pizzagate_conspiracy_theory

2016 Pizzagate Target

In 2017, it was widely reported that the Russians interfered with democratic elections. Multiple independent investigations discovered that Russia was not only impacting Americans, but Germans and British citizens too. https://www.theguardian.com/world/2017/jan/09/germany-investigating-spread-fake-news-online-russia-election

The misinformation was intended to destabilize governments. And it succeeded. From 2016 to 2023, multiple neo-fascist groups emerged with fervor in the United States, Brazil, South Africa, and France. Even India experienced a rise of populist far-right ideology. At its core, fascism seeks to rule through racial supremacy from an authoritarian, anti-immigration position. It opposes social democracies and, to an extent, capitalism.

When trolls lean into self-promotion, fascist ideologies, and misinformation while benefiting from the mass media available to them, the dangers compound.

The Enduring Love for Radio

Despite the challenges posed by internet trolls and the ever-evolving landscape of media, radio remains a resilient and cherished medium. Its ability to create genuine connections, share stories, and provide a platform for diverse voices continues to resonate with audiences worldwide. While the internet may have changed the way we consume information and connect with others, the enduring love for radio stands as a testament to the timeless power of authentic communication.

In conclusion, as we reflect on the journey from radio waves to internet streams, it’s essential to recognize the transformative power of media in shaping our connections and influencing societal dynamics. While the internet has brought unprecedented connectivity, it also brings the responsibility to navigate through the noise and uphold the values of genuine communication that have been the heart of mediums like radio for decades.

THIS ARTICLE IS A CONTINUATION: https://birdsbytes.com/2020/01/31/immediate-interruptions-semaphore-to-baseball-score

From Stage Fright to Excite

Daily writing prompt
Have you ever performed on stage or given a speech?

Stepping onto a stage, whether it’s to perform in a play, act in a commercial, or deliver a speech at a business conference, has always been an exhilarating experience for me. Having studied acting in college and being part of several plays, I discovered that the thrill of live performance is unlike anything else. The energy coursing through the audience, the anticipation in the air – it’s a unique and addictive feeling that keeps me coming back.

My time in Los Angeles, participating in commercials and short films, added an extra layer to this excitement. The fast-paced, dynamic environment of the entertainment industry made every moment on set an adventure. The challenge of embodying different characters and conveying emotions authentically was both fun and rewarding. The element of the unknown, the slight nerves before the director calls “action,” all contribute to a sense of being alive in the moment.

Transitioning from acting to public speaking at business conferences might seem like a leap, but the core motivation remains the same. The thrill of engaging with an audience, the joy of sharing insights, and even the slight nervousness before stepping onto the stage are constants. Whether I am playing a character or presenting business ideas, the stage becomes a canvas for self-expression and connection. The blend of excitement, fun, and just a touch of nerves makes each performance a unique and memorable experience, reminding me why I continue to embrace the stage with enthusiasm.

I recently gave a talk at a major conference in Boston. Two actually. The first went off without a hitch. My co-presenter and I delivered the topic as if it were our job, which in fact it was. But the second presentation was rough. Many of my co-workers and managers were in the audience. During my section of the talk, I lost my train of thought, developed dry-mouth syndrome, and started seeing double. Consciously or not, I summoned the internal wisdom to mentally step back and pause. And here is the lesson – when acting on a stage or set, presenting at a conference, or speaking in front of co-workers and strangers, do not let go of yourself. Keep a small but fiercely independent presence in your mind so you remain grounded.

Good Luck and “Break a Leg!”

Big Data Analytics for Master Production Scheduling

The very core of America’s automotive manufacturing dominance (Thomas, 2023) extends beyond the Big 3 OEMs (Ford, GM, and Stellantis); it is also found in the chain of Tier Automotive suppliers. Of the top 100 auto parts supply manufacturers in the United States, the first 20 collectively generate >$108B (Tenneco, 2021) in revenue. Across the country, over 2 million people are employed in auto manufacturing jobs, more than 80% of them in supply manufacturing alone (Alliance for Automotive Innovation, 2022), earning approximately $135B in payroll compensation. Needless to say, automotive suppliers manufacture the parts that drive the industry. The profits come at a cost, however, as major shifts in the industry are combining to hinder growth. Supply chain challenges, production changes from the OEMs, labor shortages, and external geopolitical dynamics are driving up suppliers’ costs. Add the rise of the Electric Vehicle (EV), and adaptation becomes critical for suppliers’ future growth. Many suppliers are expanding their product portfolios, differentiating between Internal Combustion Engine (ICE) and EV offerings, and increasing the products they sell to OEMs (Tominaga et al., 2023). Others can reassess their current state of business operations, relying on better analytics to trim costs from already thin earnings. One such strategy for big data analysis is Advanced Master Production Scheduling.

The following paper discusses how a tier automotive supplier might use existing operational, labor, and production-related manufacturing data to build sophisticated machine learning models for predicting more accurate Master Production Schedules. The problem with master scheduling stems from the variabilities of the end-to-end supply chain and the randomness of workloads (Dormer, Gunther, & Gujjula, 2013). The paper and associated data models attempt to overcome the difficulties with scheduling by examining and learning from years of information, developing a method to plan labor, production, and equipment schedules that work congruently, giving suppliers clearer forecasts for the future. In choosing this effort, I am exploring a component of my business with the intent of piloting a process for helping my tier automotive manufacturing customers.

Project Structure

In the Automotive Industry, the Master Production Schedule (MPS) is one of the primary business drivers. Other schedules and calendars are utilized; most offer useful forecasts for immediate results. A complementary schedule called Production Planning is like the MPS in that both provide time-bound supply and demand requirements. The biggest difference between the two schedules is that the MPS specifies what needs to be produced, how much product needs to be manufactured, and when, while the Production Planning schedule specifies how much material is needed to meet the MPS demand. Production planning also occurs on a shorter horizon, usually days and weeks. The MPS is broader, often looking months and years ahead. Building a useful long-term forecast requires multiple inputs and necessitates sufficient historical data.
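
To make the distinction concrete, here is a toy sketch of the arithmetic behind a simple MPS row, assuming a fixed lot size and a projected-available-balance rule; all figures are invented for illustration.

```python
# Toy MPS projection: schedule a production lot whenever projected
# inventory would fall below zero. All figures are invented.
forecast = [1200, 1500, 900, 1800]   # demand per period (e.g., weeks)
lot_size = 2000                      # produce in fixed lots
on_hand = 800                        # starting inventory

mps = []
for demand in forecast:
    planned = 0
    while on_hand + planned - demand < 0:
        planned += lot_size          # schedule another production lot
    on_hand = on_hand + planned - demand
    mps.append((planned, on_hand))

for period, (planned, balance) in enumerate(mps, start=1):
    print(f"Period {period}: produce {planned}, projected balance {balance}")
```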

The variability of inputs also complicates the process. An MPS is influenced by supply and demand, product complexity, variations in production processes, accuracy of quality control plans, labor resource constraints, the global supply of raw materials and component parts, and how streamlined the operations team is.

Given that long-range forecasts are combined with the variability of significantly dynamic inputs, a strong MPS is difficult to make. This project focuses on a subset of the overall MPS, using a combination of machine sensor data, production results, and fault-based analytics churned through advanced ML techniques to produce predictive outcomes. The ML techniques employed in the paper are presented not to derive a robust model that delivers an MPS methodology, but rather to demonstrate some of the capabilities that ML can provide. A handful of the techniques utilized in this paper include the following:

  1. Multi-class classification with TensorFlow and scikit-learn.
  2. Ensemble prediction with Random Forests.
  3. Deep learning models with Neural Networks.

The outcomes of these techniques represent only basic examples, and further experimentation is required. In the following sections, I explain the questions to be answered by this study.
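
Before moving on, the minimal sketch below gives a flavor of the second technique, ensemble prediction with a Random Forest. The sensor features and failure labels are synthetic stand-ins for the datasets described later, not the study’s actual model.

```python
# Minimal sketch: Random Forest on synthetic sensor-style features.
# The data-generating rule is invented purely for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
X = rng.normal(loc=8.0, scale=4.0, size=(500, 4))   # four sensor averages
y = (X.mean(axis=1) > 9.5).astype(int)              # 1 = failure (invented rule)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("Holdout accuracy:", accuracy_score(y_test, model.predict(X_test)))
```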

Technology Description

Tier Automotive customers generally utilize Enterprise Resource Planning (ERP) systems to manage data across various business units and departments. According to the Panorama Consulting Group (2020), 33.66% of global ERP system use is in manufacturing (p. 4). The advent of cloud-based ERP solutions makes it simpler for smaller automotive suppliers to implement and deploy them. Nearly 63% of ERP consumers selected them due to the low maintenance needs of Software as a Service (SaaS) systems (p. 17). Furthermore, given the cohesive nature of ERP systems, the data for the study will be confined to the schema and format of the included databases and tables.

The following data sets are used in the project. The table below outlines the sources, the size of each data set, and the data types within them, along with notes to assist in understanding the data.

| Data Sets | Size | Types | Notes |
|---|---|---|---|
| Production Capacity | Med (1600 Records) | Str / Int / Dec | Labor to Demand |
| Production History | XLrg (>4000 Records) | Str / Int / Dec / Long | Historical production |
| OEE | Med (1600 Records) | Int / Dec | Performance Efficiency |
| Equipment Failures | Med (1600 Records) | Str / Int / Dec | Machine Downtime |
| Raw Machine Data | XLrg (>4000 Records) | Str / Dec | Raw Sensor Data |

Data Set Descriptions

Production Capacity –

Each manufacturing facility sets a finite production capacity. The value of the capacity is the maximum production output the totality of equipment, labor, and resources can achieve.

Sample:

| Day Number | Production (hrs) | Capacity (hrs) | % Availability |
|---|---|---|---|
| 1 | 20.25 | 24 | 84.38% |
| 2 | 19.75 | 24 | 82.29% |
| 3 | 21 | 24 | 87.50% |
| 4 | 8.75 | 24 | 36.46% |
| 5 | 17.25 | 24 | 71.88% |
| 6 | 23.25 | 24 | 96.88% |
| 7 | 22.75 | 24 | 94.79% |

Production History –

Manufacturing production history tallies the past records of produced parts. The production history data set is valuable for building a comprehensive time series population of records that reveal a manufacturer’s capabilities.

Sample:

| Day Number | Pieces Produced | Pieces Scrapped | % Quality Product |
|---|---|---|---|
| 1 | 19850 | 1830 | 91.56% |
| 2 | 19160 | 790 | 96.04% |
| 3 | 18900 | 870 | 95.60% |
| 4 | 8710 | 1150 | 88.34% |
| 5 | 18110 | 740 | 96.07% |
| 6 | 25110 | 900 | 96.54% |
| 7 | 20480 | 510 | 97.57% |

OEE (Overall Equipment Effectiveness) –

OEE is an aggregation of capacity, historical information, and expected output, and represents the efficiency of the equipment that runs the shop floor. Factors like availability, performance, and quality are considered. OEE can be summarized as A * P * Q: Availability * Performance * Quality (Vorne Industries, 2023). The dataset used for the regression studies compares Availability to the overall recorded OEE.

Sample:

| Availability | OEE |
|---|---|
| 0.84375 | 75.73% |
| 0.822916667 | 76.67% |
| 0.875 | 75.28% |
| 0.364583333 | 32.06% |
| 0.71875 | 72.50% |
| 0.96875 | 101.00% |
| 0.947916667 | 83.26% |
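
The first sample row can be reproduced directly from the A * P * Q formula. The availability value comes from the Production Capacity sample; the performance and quality factors below are hypothetical values chosen so the product matches the recorded OEE.

```python
# OEE = Availability * Performance * Quality.
availability = 0.84375   # 20.25 production hrs / 24 capacity hrs (sample row 1)
performance = 0.95       # assumed: actual vs. ideal cycle time
quality = 0.9448         # assumed: good pieces vs. total pieces

oee = availability * performance * quality
print(f"OEE = {oee:.2%}")   # -> OEE = 75.73%
```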

Equipment Failures –

The Equipment Failures dataset includes information about machine downtime. There are categories of downtime, some planned, others unplanned. The dataset holds the records needed to identify when and how downtime occurred, and perhaps what actions were taken to address it.

Sample:

| Machine_ID | Start_Date | Start_Time | End_Time | Total_Down | Reason_Code | Planned |
|---|---|---|---|---|---|---|
| M_0017 | 1/3/2019 | 07:11:22 | 01/03/2019 07:46:40 | 00:35:18 | Die Change | 1 |
| M_0016 | 1/4/2019 | 10:44:05 | 01/04/2019 11:17:08 | 00:33:03 | Off | 1 |
| M_0016 | 1/6/2019 | 01:09:47 | 01/06/2019 02:47:44 | 01:37:57 | Die Change | 1 |
| M_0019 | 1/8/2019 | 07:18:56 | 01/08/2019 07:51:01 | 00:32:05 | Planned Maintenance | 1 |
| M_0002 | 1/12/2019 | 21:52:15 | 01/12/2019 22:25:51 | 00:33:36 | Product Error | 0 |
| M_0017 | 1/13/2019 | 01:49:48 | 01/13/2019 02:34:35 | 00:44:47 | Line Stop | 0 |
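
A minimal pandas sketch of working with this dataset might aggregate total downtime per machine, split by planned versus unplanned; the values below are copied from the sample rows above.

```python
# Minimal sketch: total downtime per machine, planned vs. unplanned.
# Column names and values mirror the Equipment Failures sample.
import pandas as pd

failures = pd.DataFrame({
    "Machine_ID": ["M_0017", "M_0016", "M_0016", "M_0019", "M_0002", "M_0017"],
    "Total_Down": ["00:35:18", "00:33:03", "01:37:57",
                   "00:32:05", "00:33:36", "00:44:47"],
    "Planned":    [1, 1, 1, 1, 0, 0],
})
failures["Total_Down"] = pd.to_timedelta(failures["Total_Down"])

summary = failures.groupby(["Machine_ID", "Planned"])["Total_Down"].sum()
print(summary)
```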

Raw Machine Data –

The raw machine data provides a population of sensor outputs for multiple pieces of equipment. The information is studied with ML to cluster and categorize potential correlations. It is good data for fostering awareness of how manufacturers struggle with big and puzzling data sets.

Sample:

| Machine_ID | Failure | AVG. | Sensor_1 | Sensor_2 | Sensor_3 | Sensor_4 |
|---|---|---|---|---|---|---|
| M_0001 | 0 | 6.934753 | 13.4916 | 19.318098 | 10.89609 | 8.401777 |
| M_0002 | 0 | 5.50641 | 6.352284 | 12.18626 | 7.757645 | 5.330642 |
| M_0003 | 0 | 5.689135 | 2.982099 | 3.522445 | 9.947512 | 3.76125 |
| M_0004 | 0 | 5.467937 | 10.988 | 12.66987 | 15.589539 | 6.975189 |
| M_0005 | 0 | 5.508549 | 13.18174 | 1.179302 | 5.485443 | 5.973635 |
| M_0006 | 0 | 5.633691 | 1.99635 | 6.031708 | 3.979602 | 14.74294 |
| M_0007 | 1 | 4.510002 | 6.984421 | 6.256924 | 6.688669 | 0.43979 |
| M_0008 | 1 | 5.064395 | 7.813207 | 10.88588 | 12.19277 | 9.667473 |
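
As a sketch of the clustering idea, assuming scikit-learn, the snippet below runs k-means on a handful of the sample rows to see whether groupings line up with the Failure flag; with so few rows it is purely illustrative.

```python
# Minimal sketch: k-means over a few sensor rows from the sample above.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

sensors = np.array([
    [13.4916, 19.318098, 10.89609, 8.401777],   # M_0001, Failure 0
    [6.352284, 12.18626, 7.757645, 5.330642],   # M_0002, Failure 0
    [2.982099, 3.522445, 9.947512, 3.76125],    # M_0003, Failure 0
    [6.984421, 6.256924, 6.688669, 0.43979],    # M_0007, Failure 1
    [7.813207, 10.88588, 12.19277, 9.667473],   # M_0008, Failure 1
])
scaled = StandardScaler().fit_transform(sensors)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scaled)
print("Cluster assignments:", labels)
```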

Platform Technologies

The platform technologies for analysis include the following:

| Technologies | Size | Notes |
|---|---|---|
| Plex ERP | XLrg | ERP, MES, QMS |
| SQL | XLrg | SQL db, tbl, SQL scripts |
| SSMS | N/A | Managing SQL data |
| ODBC | N/A | Connector to ERP |
| API | N/A | Connector to ERP |
| Apache Spark | N/A | Platform for ML |
| Databricks | N/A | Platform for ML |

Plex ERP, a cloud-based enterprise resource planning system tailored for manufacturing operations, excels in providing comprehensive integration throughout manufacturing processes. From order management to production scheduling, inventory control, and quality management, Plex ERP streamlines operations with real-time insights and process automation. Its suitability for managing intricate manufacturing data in the automotive industry makes it an ideal choice. For this study, anonymized data was extracted from Plex ERP.

Fundamental tools for efficient database management and querying, SQL and SSMS play essential roles in the project. SQL, a standard language for managing relational databases, handles tasks like data retrieval, manipulation, and schema modification. Meanwhile, SSMS serves as a robust integrated environment for managing SQL Server databases. These tools, chosen for their capabilities in data storage, retrieval, and analysis, were employed to store the extracted Plex ERP data.

In the realm of data connectivity, ODBC serves as a standard interface for connecting with databases, while APIs facilitate communication and data sharing among different software applications. Together, ODBC and API are crucial components of the project, enabling seamless communication between diverse data sources and applications. ODBC played a key role in establishing a connection between the SSMS databases and tables and Plex ERP. Plex ERP’s multiple libraries of APIs, which facilitate RESTful management of various data sources over HTTPS, further enhanced data connectivity.
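
For illustration, here is a hedged sketch of the ODBC leg of the pipeline, assuming the pyodbc package and a SQL Server instance holding the extracted data; the driver, server, database, and table names are placeholders, not the project’s actual configuration.

```python
# Minimal sketch: query extracted records over ODBC with pyodbc.
# Server, database, and table names are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=PlexExtract;Trusted_Connection=yes;"
)
cursor = conn.cursor()
cursor.execute("SELECT TOP 5 * FROM dbo.ProductionHistory")  # placeholder table
for row in cursor.fetchall():
    print(row)
conn.close()
```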

Apache Spark, an open-source data processing engine, paired with the Scala programming language, offers a fast and versatile cluster-computing framework for big data processing. These tools are instrumental in enabling large-scale data processing, analysis, and transformation, aligning with the project’s requirements.

Databricks, founded by the creators of Apache Spark, is a commercialized web-based platform that leverages Spark. It efficiently manages processing clusters and facilitates ML pipeline development in an integrated environment. This platform, chosen for its cohesion and compatibility with Apache Spark, contributes to the project’s success in data processing and machine learning.
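
Although the project pairs Spark with Scala, an equivalent PySpark sketch keeps the examples here in one language; the CSV path and column names are placeholders for the project’s actual tables.

```python
# Minimal PySpark sketch: load extracted records and fit a Random
# Forest with spark.ml. Path and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import RandomForestClassifier

spark = SparkSession.builder.appName("mps-demo").getOrCreate()
df = spark.read.csv("raw_machine_data.csv", header=True, inferSchema=True)

assembler = VectorAssembler(
    inputCols=["Sensor_1", "Sensor_2", "Sensor_3", "Sensor_4"],
    outputCol="features",
)
model = RandomForestClassifier(labelCol="Failure", featuresCol="features") \
    .fit(assembler.transform(df))
print("Trained trees:", model.getNumTrees)
```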

ML Modeling Process

Each of the previously identified datasets is flush with possibilities. Assessing the potential needs of Tier Automotive Suppliers based on the datasets is an enjoyable and important component of the modeling process. When examining the questions and hypothetical outcomes, this paper focuses on the following scope:

  1. How can historical operational, financial, and production-related data be leveraged to improve the accuracy of Master Production Schedules?
  2. What machine learning algorithms are most effective in predicting and managing variabilities in the Tier Automotive manufacturing facilities?
  3. What predictive models can assist in the identification of line stoppages, labor shortages, or equipment failures?

I call on Richard McElreath’s emphasis to “start with the causes of the data” (McElreath, 2023). The scope of these questions enforces the use of Directed Acyclic Graphs (DAGs), which are presented here as a causal model to start the study.

Start with Production History and Operational Records. Each of these populations of information directly affects Manufacturing Capacity and Production Schedules. Production history dictates what is possible; operational records define what is negotiable. Buckle in Equipment and Labor Availability, and Manufacturing Capacity becomes heavily influenced by the additional sources. For example, if labor is short, manufacturing capacity is short. If equipment is down, capacity is down. These two, in combination with Manufacturing Capacity, impact Production Quality. Quality suffers or flourishes depending on whether the three primary drivers work congruently in parallel. Production Quality in turn determines the Production History, while Equipment Availability and Labor Availability influence the operational records where financial details are considered (Labor and Overhead). By understanding the DAG within the scope of my questions, I can pursue Machine Learning models that are useful.
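
The structure described above can be written down explicitly, assuming the networkx package. Because Production Quality feeds back into Production History over time, the history node is indexed to the next period to keep the graph acyclic.

```python
# Sketch of the causal DAG described above. The Quality -> History
# edge is a feedback loop over time, so History is indexed to the
# next period (t+1) to preserve acyclicity.
import networkx as nx

dag = nx.DiGraph([
    ("Production History", "Manufacturing Capacity"),
    ("Operational Records", "Manufacturing Capacity"),
    ("Equipment Availability", "Manufacturing Capacity"),
    ("Labor Availability", "Manufacturing Capacity"),
    ("Equipment Availability", "Operational Records"),
    ("Labor Availability", "Operational Records"),
    ("Manufacturing Capacity", "Production Quality"),
    ("Equipment Availability", "Production Quality"),
    ("Labor Availability", "Production Quality"),
    ("Production Quality", "Production History (t+1)"),
])
assert nx.is_directed_acyclic_graph(dag)
print(list(nx.topological_sort(dag)))
```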

To Continue… please read & download the report here:

Immediate Interruptions – Semaphore to Baseball Score

Marconi's Radio

The other day I shouted across the office to a co-worker. I asked them about a production order and its status on the shop floor. They shouted back an answer. It was a super quick way to ask for and receive information. Yet as we all know, sometimes the spoken word does not suffice. Whether it be distance, the din of war, or even language barriers, immediate communication and the influence of messaging dramatically evolved over the past 150 years. Let’s take a look back at the core tools used for instant (relatively speaking) communication.

Télégraphie Chappe

The French optical telegraph emerged in the late 18th Century. Invented by Claude Chappe, the system utilized a country-wide network of relay towers. Each tower consisted of a mast and mechanical arms. An operator moved the arms into standardized positions. Each position represented a text or code. A sequence of codes or positions generated a full message. The source message originated on one tower and the nearest towers would then relay the exact messages to the next towers in their lines of sight. The sequence was repeated from tower to tower until the message arrived at its destination. The network was massive with over 500 stations covering the main trade routes of the country.

During this time, the French possessed an unparalleled advantage over their adversaries. Napoléon Bonaparte leveraged the technology to take over most of Continental Europe in the early part of the 19th Century. Before the network, a message from Paris to Strasbourg, a distance of 445 km, took 4 days on horseback; after installation, semaphore messages were delivered in 2 hours. Using the communication system, his armies knew the whereabouts of their opponents, transferred battlefield information across the country, and called on reinforcements when needed. Their opponents were left standing around wondering how the French knew so much. The system was huge and far-reaching, and not unlike Napoléon’s bombastic decisions to invade Russia and England that spread his armies thin, the optical telegraph met its demise quickly. The electric telegraph replaced it within the same generation.

Four Score and Fifty Thousand Miles

When the electric telegraph was first introduced to the United States by Samuel Morse, several European inventors were already experimenting with electric telegraphy in England and France. The European models required multiple wires. Morse developed a single-line transmission system as well as a language to go with it. Morse’s invention was successfully utilized in the Washington D.C. area. His installation along the Baltimore & Ohio Railway connected two buildings 38 miles apart. The first Morse telegraph was sent and received in May 1844. Between 1844 and 1860, a handful of companies strung over fifty thousand miles of telegraph line and employed over ten thousand telegraph operators at fourteen hundred telegraph stations across the North American Continent.

In 1861 the Civil War broke out.

Almost immediately, Abraham Lincoln established the U.S. Military Telegraph Corps. The civilian corps functioned to support the battlefield commanders of the Union Army. And during the war, “the President spent more time in the War Department telegraph office than any other place” (Lincoln in the Telegraph Office, David Homer Bates, 1907). The Confederacy did not focus their attention on establishing telegraph lines. Instead, they focused on tearing down the Union’s telegraph wires.

While the Union faced early setbacks at Bull Run (Manassas), the fearless support from the Telegraph Corps ensured that open lines of communication were always available to Washington. Over a short time, the Corps also established ciphers and codes to confuse and challenge the South. Being able to send succinct telegraphs across the vast reaches of the Union and receive quick replies, coupled with the South’s latency in setting up its own communications infrastructure, gave Lincoln a substantial advantage in the Civil War. There is enough history to suggest that the decision to form the Telegraph Corps won the war for the Union three years before the burning of Atlanta.

The telegraph remained a vital instrument of instant communication for decades. Western Union and AT&T made billions sending telegrams to family, friends, businesses, and loved ones from the 1890s to the 1980s. Allegedly, you can still send a telegram for about $10.

Titanic to Carpathia: We Require Immediate Assistance

1912, April 14, 10:41pm (EST) – The RMS Titanic radiotelegraphed for help. Four hours later, the unsinkable sank. 1,500 souls perished. And the Carpathia, arriving at 2:00am, rescued the roughly 710 survivors. But this is not a story about the Titanic. James Cameron nailed that. This is a story about Marconi’s Radio.

The Marconi Radio was the first wireless telegraph, or radiotelegraph. Short bursts of Morse Code were transmitted through air instead of across wires. Used on the Titanic, the Marconi Radio transmitted wave after wave of distress signals to all who were listening. So new was the technology that its founder, Guglielmo Marconi, had not yet opened a manufacturing facility for his radios. The Carpathia, and many other ships, received the radiotelegraphs transmitted by the Titanic, but most failed or could not respond to the most crucial message of all.

The failure to respond prompted international investigations and policies. Crew members testified about the radiotelegraphs sent to and from the Titanic. A BBC documentary was produced for the centennial of the disaster, capturing the decoded messages and reading them in English. Most of the radiotelegraphs are cordial, but the distress calls are fascinating and haunting. Would a faster reaction from the Californian or Carpathia have saved more lives? We may never know. What is known from the Titanic wreck is that technology enabled information to be transmitted anywhere, anytime, for anyone listening.

Marconi’s Radio transmitted Morse Code wirelessly for just a few decades; most notably, it was used on the Titanic.

Come Here, I Want to Talk to You

A single copper wire connected New York to San Francisco. In New York, Alexander Graham Bell sent his voice into the carrier. In San Francisco, an analog signal shot across the continent arrived at the receiver of Thomas Watson. There was not much to hear, mostly muffled buzzing and hums. Yet in 1915 it was the first transcontinental phone call.

Amplitude Modulation (AM) wireless transmission arrived around the same time. It would be another 20 years before wired telephone calls were clear. And during those two decades, AM radio was the Bee’s Knees.

Wirelessly Connected to the Pinstripes

KDKA AM Radio in Pittsburgh, Pennsylvania broadcast a live baseball game between the hometown Pirates and their in-state rivals, the Philadelphia Phillies, in 1921. It was this game that brought the ballpark to the listener. “Baseball on the radio is language first and foremost” (Crack of the Bat: A History of Baseball on the Radio, James Walker, 2015). This language of the announcers, the fans in the stadium, the sound of the pitched ball slapping the catcher’s mitt, or the ball off the barrel instantly connected fans to their heroes. Close your eyes and you are already there.

Radios were cheap, baseball was slow and narrated. The two were a perfect match.

END OF PART ONE

NEXT UP in this two part series -> Radio, We Still Love You. Not You, Internet Trolls.

I believe in total transparency: very little of what I write is new or insightful. The purpose is to show how rapidly technology changed during the last 200 years – so much change occurred that the World now relies exclusively on data transfer to make it through everything, every day. READ ON.

Committed to Memory

“When I was younger I could remember anything, whether it happened or not; but I am getting old, and soon I shall remember only the latter.”

Mark Twain, 1907

I attended a Dale Carnegie Skills for Success preview class last week. Very early into the session, the students were asked to commit a random sequence of objects to memory. The objects were arranged in an odd sequence. The sequence went something like this: Imagine Delicate Dinnerware with a No. 2 Pencil rising up from the center of the plate where a Jersey Cow balanced on the eraser, on the back of the cow sat Curious George (the monkey) with a small cut on his forehead. In George’s hand is a massive bag of ice. Perched on top of the ice bag is Marilyn Monroe who is holding a giant cruise ship. The ship is painted with an enormous Blue S on its bow. At the back of the cruise ship are fresh, new hams that are all wrapped in sheet music, the lyrics to Carry Me Back to Ole Virginny written on them. Sticking out of the hams is an upside down Empire State Building. Resting on the skyscraper’s base is another cruise ship, this time an enormous Red N is painted on its bow. Lastly, a pair of Island Red Hens dance at the stern.

The session only required 20 minutes for us to commit these objects to memory. As you can see, I have not forgotten them!

We all laughed at the absurdity of memorizing such an assortment of unrelated objects. But the last laugh of the memory was going to be on us…

Gustav’s Setting Device for Calculating Machines… and the Like!

In 1929, Austrian super genius Gustav Tauschek invented the world’s first magnetic drum storage device. Yes, Charles Babbage designed the Analytical Engine with punch-card storage, but Babbage’s device was mechanical, not electronic. The patent granted to Tauschek for his “Setting Device for Calculating Machines and the Like” in 1932 featured only a simple drawing of the device. The device utilized a rotating metal drum coated with a ferromagnetic surface. The surface was imprinted by a series of stationary write heads. Another series of stationary read heads recognized the patterns imprinted on the drum. The patterns were no more than simple particle orientations, formed by altering the ferromagnetic material at a particular location. The heads read the magnetic orientations as 1 or 0. Tauschek’s first drum stored 62.5KB of data.

Drum storage was utilized from the 1940s to the 1980s.

Microchips, Processors, and RAM

Robert Noyce co-invented the first integrated circuit in the late 1950s. Integrated circuits replaced discrete transistors and are ubiquitous today. ICs are found in EVERY electronic device manufactured. Noyce later co-founded the most formidable chip manufacturer in history – Intel. Intel’s impact on the global computer industry needs little introduction. Noyce has recently been called the “Thomas Edison and Henry Ford” of modern computing. Yet only recently have historians begun aggregating Noyce’s contributions to the world.

‘Bob Noyce cannot be forgotten,’ as the biography “The Man Behind the Microchip: Robert Noyce and the Invention of Silicon Valley” by Leslie Berlin strongly articulates.

Watch Leslie Berlin’s presentation to Google about Bob Noyce

Noyce’s inventions and companies drove the industry of microchips, RAM, and storage. His memory will certainly endure.

Hard Disk Drives

IBM invented the modern Hard Disk Drive (HDD) in the late 1950s. These drives captured and stored data on rotating magnetic platters. Write heads performed the particle manipulations as the disks rotated underneath; read heads interpreted the orientations of the particles, processing them through logic registers.

HDDs are still in use today, with orders of magnitude more capacity than IBM’s 350. Recent innovations in optical storage (binary strings encoded on composites and substrates) and Solid State Drives are quickly rendering HDDs a thing to remember.

Running out of Space?

A recent study by Yale suggests that the rare Earth metals used in manufacturing drives, chips, and processors may be drying up. A subsequent scientific journal digs very deeply into the question of “are we running out of space?” The conclusion is that we are not there yet. This does not prevent us from asking the questions though. What about the future of storage and memory? Will businesses rely on cloud storage because on-premise data storage becomes cost-prohibitive? What about personal computing? Will all the future devices emulate the iPad and Surface – dummy devices that only present while storage and compute occur elsewhere?

Saved by DNA?

Breakthrough technology enables data imprinting on, and reading from, strands of synthetically produced DNA. The technique generates binary storage by encoding data in sequences of nucleotide bases rather than in magnetic orientations or substrate modifications.

A Single Gram of DNA can store EVERYTHING.

The quantity of stored data is doubling every year, so we may need to rethink data consumption vs. storage as a function of digital asset planning. One thing is for certain, though: DNA storage would eliminate any concern over the loss of rare Earth metals.

I have to ask the obvious question: “how different is DNA data storage from our own capabilities for memory?”

Random Memory, Accessed

The Dale Carnegie class finished with a unique exercise. The instructor asked for a volunteer and naturally I raised my hand. She asked if I knew the original 13 colonies. I rattled off a handful, but failed when I said Maine. She then asked if I remembered anything about the string of objects we committed to storage earlier in the session.

Needless to say, I know all 13 colonies AND the order in which they gained Statehood!

That is definitely a random memory, stored in my DNA, accessible at a moment’s notice for the rest of my life. And I won’t need digital storage to save it for me.

After Reviewing the Play, the Call is Reversed.

I make mistakes. Some days I make more than others. I feel bad each time. It’s when I’m feeling bad about the mistake that I remind myself, “It’s ok to make mistakes. I’m going to learn something from them! I’m going to find a way to reverse them!”

Last week, I deleted 2,943 emails from a shared account. I was attempting to execute admin level Powershell commands which I found on the web. It was a simple set of commands that opens the mailbox on the Exchange Server, identifies a filtered set of emails, and then moves them to a designated location. In this case, I wanted to move the emails to another mailbox.

It’s important to know that I had several browser tabs open to various pages of PowerShell actions and commands. One allowed me to log in, another to gain access to Exchange, another to impersonate a mailbox user, another to copy emails – you get the idea.

I jumped back and forth between the tabs to find the commands I needed for the sequence of actions necessary to execute this simple process.
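Pieced together, the sequence probably looked something like this. To be clear, this is a reconstruction and not my exact commands – the mailbox names and the search query are hypothetical, and the Search-Mailbox cmdlet has since been deprecated in favor of compliance search:

```powershell
# Connect to Exchange Online (requires the ExchangeOnlineManagement module)
Connect-ExchangeOnline -UserPrincipalName admin@contoso.com

# Copy a filtered set of emails from the shared mailbox into a folder
# of another mailbox -- this is what I intended to do
Search-Mailbox -Identity "SharedMailbox" `
    -SearchQuery 'Received:01/01/2019..12/31/2019' `
    -TargetMailbox "ArchiveMailbox" `
    -TargetFolder "Recovered" `
    -LogLevel Full

# The fatal variant: -DeleteContent removes the matching emails from
# the source mailbox instead of copying them anywhere
Search-Mailbox -Identity "SharedMailbox" `
    -SearchQuery 'Received:01/01/2019..12/31/2019' `
    -DeleteContent -Force
```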

That’s probably when I copied the emails from the shared -MailboxIdentity and made a -MoveTo and dropped them in the -HardDelete of the other mailbox.

Hard Delete?! HARD DELETE?!

Big Mistake!

If only I had instant replay and could overturn the previous decision…

1978: NFL Instant Replay is Born…. and Dies.

The 1978 Hall of Fame game in Canton, Ohio between the Philadelphia Eagles and Miami Dolphins was an exhibition of skill, class, and technological futility. This game was the official introduction of Instant Replay to the NFL sporting world. Replay had unofficially appeared as a concept two years prior, but the ’78 season was its first. Six regular-season games featured the use of replay. Yet the lack of quality cameras, the high cost of the technology, and excruciatingly lengthy replay reviews forced its immediate cessation.

The NFL 100 Greatest Moments: Instant Replay

A little less than ten years later, the NFL reintroduced Instant Replay. The 1985 season featured a number of games with the replay technology. One needs little reminder that cameras and live televised events were eons behind today’s standards. Take a look at the 1985 Week 5 matchup between the Bears and Bucs. Enjoy the low-def 4:3 aspect ratio in all its historic glory! How could anyone overturn a call from these cameras?

Enjoy an instant classic! The 1985 Bears visited the winless Tampa Bay Buccaneers; trailing 12-3 at halftime, the eventual Super Bowl champs raced to a third-quarter lead. The Bucs rallied to make the final two minutes very interesting, but the Bears ultimately won the game 27-19.

Stats Prove Human Fallibility, but Replay Dies Again

In the six seasons between 1985 and 1991, Instant Replay was utilized 2,967 times in 1,330 games. The overall reversal rate was 12.6%, across an average of 2.2 reviews per game. What is fascinating about the use of replay is the yearly increase in reversals. Between 1985 and 1991, reversals increased almost every year – suggesting that humans could not reliably make the correct call on the field and required advancing technology to assist in judgment. The last three years of Instant Replay in this era demonstrated greater and greater technological disruption to the human adjudication of NFL games. Given the improving quality but the lingering animosity, the NFL killed off replay – again – after the 1991 season.

No one needed Instant Replay for Scott Norwood’s “Wide Right” kick.

The 90’s, Replay Lies Dormant

Not until the late 1990s did the NFL revive Instant Replay. In the new format established for 1999, replay rewarded coaches for challenges that were overturned and punished them for challenges where the call stood. Unsurprisingly, the number of reversals increased dramatically compared to replay use in the ’80s. To this day, the overall reversal rate still inches upward almost every year. The overall percentage of reversals was 29% in 1999 vs. 43% in 2016, while the average number of reviews per game barely moved a yard.

When replay was reintroduced in 1999, the equipment was vastly better than in 1985 or even 1991. Technology finally provided the clear eye in the sky that replay required to become an integral component of the game’s refereeing. Instant Replay remains in widespread use across American sports to this day.

The Replay around the World

The greatest game in the world, football – or soccer to Americans – introduced replay in the early 2010s. Born out of the Dutch Eredivisie, the Video Assistant Referee (VAR) emerged a mere 30 years after replay’s introduction to American football – or throwball, to everyone else. Instant replay globalized. Unlike replay in the NFL, where almost every damn play can be reviewed (except in the last 2:00 or in Overtime), VAR can ONLY be used for 4 distinct categories: 1) Goal or No Goal, 2) Penalty Kick or Not, 3) Red Card or Not, 4) Mistaken Identity or Not (players wearing another player’s jersey).

The most popular league in the Western world is the Premier League, where VAR was introduced this year and implemented across all clubs. VAR has undoubtedly changed the outcome of the league’s matches. Unfortunately for Everton fans, Liverpool’s desperate pursuit of the league title – something the club has not won since the 1989-1990 season – is well underway, with VAR decisions breaking decidedly in their favour. Could the first season of VAR supplant actual human football in crowning a champion? The fact is, Liverpool would only be a few points ahead of Man City were it not for replay reversals. Did the clubs make an egregious human mistake when implementing VAR?

The Premier League pundits certainly think VAR is a mistake.

The beauty of replay in the two biggest sports leagues on the planet is that humans invented it. And like the humans who make them, sometimes the inventions are a mistake. Sometimes the technology of instant replay caused mistakes. Sometimes the implementation of the technology caused mistakes. But the results of plays being upheld or overturned by instant replay consequently affect the outcomes of very important, albeit expensive, sporting events.

We all make mistakes. Instant Replay exists to ensure that our mistakes are not too costly. The technology utilized in VAR and Instant Replay is light-years beyond that of 5, 10, or 15 years ago, making it absolutely sufficient for overturning a human error in judgment.

Recycle Bins are the Technologist’s Instant Replay

The cold sweat oozed from my forehead, my pulse quickened, and I lost my hearing for about 5 minutes as my basic instincts took over. I knew that losing 2,943 emails was bad enough, but what would the future hold when I told my executives about the data loss? That’s when I trusted the technology, that’s when I knew the tech was better now than ever before, that’s when I knew a deep recycle bin was there to overturn my mistake.

I calmly entered the PowerShell commands to recover the deleted emails, and then took a long walk around the buildings.
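For anyone facing the same cold sweat, the recovery looked roughly like this – a sketch that assumes Exchange Online, assumes the hard-deleted items were still within the Recoverable Items retention window, and uses hypothetical mailbox names and dates:

```powershell
# List the hard-deleted messages still sitting in Recoverable Items
Get-RecoverableItems -Identity "SharedMailbox" `
    -FilterItemType IPM.Note `
    -FilterStartTime "01/01/2019" -FilterEndTime "12/31/2019"

# Restore them to their original folders
Restore-RecoverableItems -Identity "SharedMailbox" `
    -FilterItemType IPM.Note `
    -FilterStartTime "01/01/2019" -FilterEndTime "12/31/2019"
```

The deep recycle bin did exactly what it was built to do: it overturned the call on the field.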

Humans make mistakes, but modern and disruptive technologies are helping us reverse the plays!

Been UN-Verified?

Scammers, Public Information, and Opting Out

Stop the Scammers!

On Christmas Eve I received several phone calls from a (626) ###-#### number. No caller ID was presented, and no one left a voice mail. Later in the day, I was called again from a (626) ###-#### number (the seven digits are left off because the caller generated different sequences, probably using VoIP). Noticing this pattern of randomness, I – out of research interest, of course – took the call. It was a robocall. The message told me that 1) my Social Security account was compromised, 2) all my Social Security benefits would be wiped if I did not take action, and 3) I could connect to a live person by pressing #1.

Naturally, I pressed #1.

A human voice answered after a brief pause.

Hello, my name is Richard Wells. I am a Social Security Administration representative. With whom am I speaking today?

I gave a fake name: I told them my name was Larry Peterson. The attendant paused for a few seconds, asked how my day was going, and asked what the nature of my call was.

I heard them typing while I elaborated on my worries about all my Social Security benefits being wiped. The attendant came back after several seconds and asked, “are you sure your name is Larry Peterson?”

I said, “Yes! Since birth!”

The attendant said, “are you sure your name is not Ben Bird?”

Cybercrimes Against Humanity

Robert Tappan Morris created the first known computer worm in 1988. He was studying at Cornell, launched the worm from MIT, and was responsible for shutting down multiple computer systems, potentially costing millions of dollars.

Morris, inadvertently or not, started the era of cybercrime.

In 1999, the Melissa Virus was uncovered. It was the most virulent and rapidly deployed infection to date, leveraging email attachments to spread. Users would open the attachments and unknowingly download the payload. Once the payload installed itself on the new client, it would hijack the user’s Outlook account and send itself to every address in the user’s contact list. While the Melissa Virus interrupted businesses and users, it was not intended to extract dollars and euros from its hosts. Still, its deployment method set in motion a wave of cybercrimes that would later account for BILLIONS of lost and stolen dollars.

Dawn of Cryptolocking

In 2013, the BBC reported that over 250,000 users were infected with a nasty new infection – Cryptolocker. These infections digitally locked the files and data on their hosts and mandated that users pay up or lose all their files! The first ransomware infections leveraged Trojan-horse-style malware, misleading users into acting on something nefarious – like downloading a malicious payload or entering authorization credentials to be used against them later.

Cryptolocking and ransomware are very real and very damaging. To learn more about the dangers of ransomware, check out KrebsOnSecurity, the FBI, or follow along with The Hacker News. To date, ransomware has extracted from or cost businesses over $11B.

SoCiAl Media & Social Engineering

The rise of social media like Facebook, Twitter, Instagram, Snapchat, WhatsApp, and others exposes users to a whole new galaxy of cybercriminals. These apps and sites provide spaces for users to share pictures of their dogs, influence 14M followers, or spout off political nonsense. On the surface, these sites offer friends and families multiple opportunities to follow along and share in the highs and lows of life. Underneath the surface, all your publicly facing content offers cybercriminals a vantage point into the deepest parts of your lives. Criminals engineer attacks on individuals from the social media apps and sites. The criminals contact you and “act” like they know you, and in many ways they already do.

The crimes derived from social engineering are open attempts to get you to divulge confidential information.

Social engineering is the art of manipulating people so they give up confidential information. The types of information these criminals are seeking can vary, but when individuals are targeted the criminals are usually trying to trick you into giving them your passwords or bank information, or access your computer to secretly install malicious software–that will give them access to your passwords and bank information as well as giving them control over your computer.

https://www.webroot.com/us/en/resources/tips-articles/what-is-social-engineering

The evolving nature of social engineering as a toolkit for malicious intent gave rise to a new scam: the deepfake. Deepfake technology utilizes publicly accessible images and videos to map the voice, face, hair, and body movements of one person onto another. Here is an example of an openly acknowledged deepfake of George Lucas talking about Disney’s The Mandalorian.

Sadly, thieves are using deepfake software to emulate the voices of business owners, CEOs, celebrities, and other ‘influencers’ to scam financial transactions out of companies. Imagine being the CFO of a medium-sized company, receiving a call from a local number, hearing your boss on the other end of the line, and being asked to wire $250,000 to a bank account so that “we can get this deal done!”

Aggregating the Data

A quick scan for “public records” turns up 20-25 different resources which, for a small fee, will look up the history of you or someone for whom you are searching. These resources comb public information maintained by local, state, and federal governments, as well as the social media sites and apps used by the person being searched. Many of these public records aggregators – WhitePages, BeenVerified, Spokeo – are open to the world. Do a quick ego search on Ancestry or WhitePages and you’ll be very surprised how much information they have already collected on you.

You can even run a reverse phone number lookup to see “Who is Calling You?”

Are you sure your name is…?

Fake Richard Wells from the Social Security Administration may have used a reverse phone number lookup to gather some information about me. In the time it took me to wax woeful about the potential loss of my Social Security benefits, the attendant probably ran a premium WhitePages lookup on my phone number. The results gave them access to the publicly facing records of my name, address, closest relatives, social media accounts, known locations lived, and schools attended – and if they searched Ancestry, they probably had access to all my known or suspected relatives long since baptized posthumously by the LDS.

The rest of the conversation was stark. I used some choice words to scold them for attempting to scam people out of confidential information. Fake Richard Wells told me that I had a small…

Opting Out

There are enough ways for cybercriminals to attack you. Emails, viruses, cryptolockers, social media scams, and deepfakes are just the tools of the trade. But YOU can defend yourself. Email security, strong passwords, multi-factor authentication, endpoint security, VPNs, firewalls, etc. are the frameworks for defending against the attacks. While you may not totally prevent the attackers from getting your information, you can certainly do your best to slow them down.

I surmised that Fake Richard Wells used online public records and data aggregators to reverse-look me up. That’s when I found these opt-out links for the top public information aggregators:

BeenVerified: https://www.beenverified.com/app/optout/search

Spokeo: https://www.spokeo.com/opt_out/new

WhitePages: https://www.whitepages.com/suppression_requests

There are many more aggregators from which I need to opt out. But I’ve already lowered my “Spam Score” by 45%.

Opt-Out of Major Aggregators

Take the time to opt out of these data aggregators to reduce the amount of public information accessible to cybercriminals. They’re looking for a quick turnaround and will not waste their time on vigilant, aware targets.

Be Safe!

Needles, Stockings, and Early Manufacturing Tech

1589

William Lee invented the first knitting machine. The mechanical device utilized a wooden frame, a set of needles, and control arms for the operator to raise and lower the needles and thread, generating the loops and hooks necessary for knitting. He presented his instrument to Queen Elizabeth I in hopes of earning a patent; while the Queen loved her knitted stockings, she probably preferred the feel of silk over the rough-hewn garments produced by the machine, and she declined (adapted from A History of Hand Knitting, Richard Rutt, page 67). The Queen’s refusal was also attributed to the harm his manufacturing technology might do to her citizens (Why Nations Fail: The Origins of Power, Prosperity, and Poverty, Acemoglu, D.; Robinson, J., page 182). Perhaps Lee’s invention was the first manufacturing tech to raise fears about losing jobs?

Stocking-Frame from the Ruddington Framework Knitters Museum.

Lee took his invention to France, where he was given a patent on his stocking-frame. He and a small team operated a knitting business until his death in 1610. His brother then returned to London and set up a silk knitting shop. Ten years later, the stocking-frame was in use in many parts of England, and by 1638, the use of fulling mills had made its way to the British Colonies. It’s fair to say that Lee’s invention generated a boom in fabric machines and fabric societies – societies consisting of subdivisions of larger and older guilds.

William Lee and his Stocking-Frame make up the seal of the Worshipful Company of Framework Knitters.

Over the course of the next 50 years, the knitting industry flourished with the use of the mechanical stocking-frame. There are few references to Lee’s invention as a catalyst for the Industrial Revolution, but its proximity to other garment industry inventions and processes makes the Stocking-Frame a strong candidate.

Digital Disruptors

Considered the first fully integrated and programmable digital computer, ENIAC is closing in on its 75th anniversary.

ENIAC – the first digital disruption

The ENIAC pioneered the digital age, disrupting military analytics and simulation by empowering engineers to calculate multiple ballistic trajectories in unison. What made ENIAC so powerful was its ability to store ten-digit decimal numbers in memory for subsequent use.


ENIAC was built by John Mauchly and J. Presper Eckert; John von Neumann consulted on its programming, while Arthur Burks joined the project as a general designer. All of them contributed to a robust digital expansion after the ENIAC. von Neumann split with Mauchly and Eckert at the conclusion of the project, vowing that binary arithmetic, not decimal, was the faster path forward. His work with the ENIAC formed the basis of the von Neumann architecture, on which virtually all subsequent computer designs were framed.

von Neumann Architecture

John von Neumann was also a proponent of education. He pushed for more math and science in schools to develop future leaders in the space.

von Neumann was outspoken, disruptive, and a challenger. He pursued excellence in himself and his teams.

IAS Computer – the original von Neumann machine

Built at the tail end of the ENIAC project, the Institute for Advanced Study (IAS) computer was the first von Neumann machine. The goal of the machine was to make computing more accessible to students, scientists, engineers, and the US military. The work on the IAS machine led to the creation of a magnetic drum for memory storage, a device later implemented in the IBM 701 computer.

Within the decade following the ENIAC, von Neumann influenced computer design and architecture – considerably disrupting and antiquating each predecessor.

John von Neumann died in 1957, aged 53, of terminal cancer possibly linked to his work with the Manhattan Project. Just before his death, von Neumann described the rapidity with which technology evolved, suggesting that a singularity would occur in the future.

“The technology that is now developing and that will dominate the next decades is in conflict with traditional, and, in the main, momentarily still valid, geographical and political units and concepts. This is a maturing crisis of technology… The most hopeful answer is that the human species has been subjected to similar tests before and it seems to have a congenital ability to come through, after varying amounts of trouble.”

—von Neumann, 1955

The Technological Singularity, Murray Shanahan (MIT Press, 2015), page 233

While the ENIAC owns the distinction of being the world’s first general-purpose digital computer, it was not the first electronic digital computer. That distinction belongs to Colossus.

We’ll dig into that disruption soon!