Politician Translator with Spacy and Negate
Find out what politicians really mean by changing their text to the opposite.

This is a short post. We hear politicians talking all the time, but often they mean the opposite of what they say. For example, if a politician says that he will lower taxes, taxes will go up. If a politician says he did not have a relationship with that woman, then ... Etc.
So I thought, why not make a Politician Translator in Python? In this post, I start with the results. The code is at the end.
The Politician Translator in action
Joe Biden on 9 January 2023 (Twitter):
The first two years of my presidency were the two strongest years of job growth on record.
These historic jobs and unemployment gains are giving workers more power and American families more breathing room.
Politician Translator:
The first two years of my presidency were not the two strongest years of job growth on record.
These historic jobs and unemployment gains are not giving workers more power and American families more breathing room.
Bill Clinton in 1998:
I did not have sexual relations with that woman, Miss Lewinsky.
Politician Translator:
I had sexual relations with that woman, Miss Lewinsky.
Joe Biden on 6 January 2023 (Twitter):
My economic plan has always been to grow our economy from the bottom up and the middle out.
Today we learned that unemployment is at a 50-year low after the two strongest years of job growth ever.
We're creating jobs. We're lowering costs. Our plan is working.
Politician Translator:
My economic plan has not always been to grow our economy from the bottom up and the middle out.
Today we did not learn that unemployment is not at a 50 - year low after the two strongest years of job growth ever.
We are not creating jobs. We are not lowering costs. Our plan is not working.
Don't you love it!
The Politician Translator can also be used to train politicians.
For example, they must say:
I like peanut butter and eat it every day.
When they really mean:
I do not like peanut butter and do not eat it every day.
Another one. They must say:
My first birthday was great. My 2. was even better.
When they mean:
My first birthday was not great. My 2 . was not even better.
Some details about the code
When I got this idea, I first looked at using antonyms. That was too complex for a short project, maybe in a next version. Then I found the Python package 'Negate'. Very nice, but it cannot handle compound sentences.
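To give an idea of what 'Negate' does on its own, here is a minimal sketch. These are the same calls used in the full code below; the exact output depends on the model 'Negate' loads, so take the expected result as an illustration only.
from negate import Negator

negator = Negator(use_transformers=True)

# a simple sentence is handled fine
print(negator.negate_sentence('I like peanut butter.', prefer_contractions=False))
# expected: something like 'I do not like peanut butter.'

# compound sentences are where it falls short (see above),
# which is why the code below first splits them into parts
print(negator.negate_sentence('I like peanut butter and eat it every day.', prefer_contractions=False))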
So I looked for a way to break the sentences into parts. To do this, we use the POS tags of a sentence and start a new part at every coordinating or subordinating conjunction (CCONJ, SCONJ). Although very primitive, this works in many cases.
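For example, this is roughly how the split points show up in the POS tags, assuming the en_core_web_sm model also used in the code below:
import spacy

nlp = spacy.load('en_core_web_sm')
doc = nlp('I like peanut butter and eat it every day.')
for token in doc:
    print(token.text, token.pos_)
# 'and' is tagged CCONJ, so this sentence is split into
# 'I like peanut butter' and 'and eat it every day .'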
Once we have the parts of a sentence, we run 'Negate' on them, and glue them together again. Then some fix up, and done.
Politician Translator: The code
# your text: the last uncommented assignment below is the one that gets translated
text = """The first two years of my presidency were the two strongest years of job growth on record.
These historic jobs and unemployment gains are giving workers more power and American families more breathing room.
"""
text = """I did not have sexual relations with that woman, Miss Lewinsky."""
#text = """I like peanut butter and eat it every day."""
#text = """My first birthday was great. My 2. was even better."""
import re
from negate import Negator
import spacy

class TextToSentences:
    def __init__(self):
        self.nlp = spacy.load('en_core_web_sm')
        self.nlp.add_pipe('sentencizer')

    def get_sentences(self, text):
        doc = self.nlp(text)
        return [sent.text.strip() for sent in doc.sents]

class SentenceToParts:
    def __init__(self):
        self.nlp = spacy.load('en_core_web_sm')

    def get_parts(self, sentence):
        doc = self.nlp(sentence)
        parts, words = [], []
        for token in doc:
            # start a new part at a coordinating or subordinating conjunction
            if token.pos_ in ['CCONJ', 'SCONJ']:
                parts.append(' '.join(words))
                words = []
            words.append(token.text)
        # do not forget the last part
        if len(words) > 0:
            parts.append(' '.join(words))
        return parts

class NegateParts:
    def __init__(self):
        self.negator = Negator(use_transformers=True)

    def get_negated_parts(self, parts):
        negated_parts = []
        for part in parts:
            negated_parts.append(self.negator.negate_sentence(part, prefer_contractions=False))
        return negated_parts

class StitchParts:
    def __init__(self):
        self.nlp = spacy.load('en_core_web_sm')

    def get_stitched_parts(self, parts):
        stitched_parts_items = []
        for i, part in enumerate(parts):
            # first word to lowercase if not a proper noun (PROPN)
            doc = self.nlp(part)
            words = []
            for j, token in enumerate(doc):
                word = token.text
                if i > 0 and j == 0 and token.pos_ not in ['PROPN']:
                    word = word.lower()
                words.append(word)
            stitched_parts_items.append(' '.join(words))
        return ' '.join(stitched_parts_items)

class FixUpSentence:
    def __init__(self):
        pass

    def get_fixedup_sentence(self, sentence):
        # trim
        sentence = sentence.strip()
        # fix: ' .' at the end of a sentence
        sentence = re.sub(r'\s\.$', '.', sentence)
        # fix: ' ’s'
        sentence = sentence.replace(' ’s', '\'s')
        # fix: ' , '
        sentence = sentence.replace(' , ', ', ')
        return sentence

tts = TextToSentences()
stp = SentenceToParts()
np = NegateParts()
sp = StitchParts()
fus = FixUpSentence()

# step 1: split text into sentences
sentences = tts.get_sentences(text)

ftext_items = []
for sentence in sentences:
    # step 2.1: split sentence into sub-sentences
    parts = stp.get_parts(sentence)
    # step 2.2: negate sub-sentences
    nparts = np.get_negated_parts(parts)
    # step 2.3: create a sentence from the negated sub-sentences
    nsentence = sp.get_stitched_parts(nparts)
    #print('nsentence = {}'.format(nsentence))
    # step 2.4: remove extra spaces etc.
    fsentence = fus.get_fixedup_sentence(nsentence)
    ftext_items.append(fsentence)

# step 3: join sentences
ftext = ' '.join(ftext_items)

print('what they say = \n{}'.format(text))
print('what they mean = \n{}'.format(ftext))
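A note on setup: I assume you have spaCy with the en_core_web_sm model ('python -m spacy download en_core_web_sm') and the 'Negate' package ('pip install negate', see the links below) installed. The 'use_transformers=True' option may pull in extra dependencies; if it complains, check the 'Negate' documentation or set it to False.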
Summary
I could say that I did not enjoy this short project, but then you would use the Politician Translator and know what I really mean ... ;-)
Links / credits
How to break up document by sentences with Spacy
https://stackoverflow.com/questions/46290313/how-to-break-up-document-by-sentences-with-spacy
Negate
https://pypi.org/project/negate
Tutorial on Spacy Part of Speech (POS) Tagging
https://machinelearningknowledge.ai/tutorial-on-spacy-part-of-speech-pos-tagging