Infrared Decoder and Transmitter

Submitted by Matt on Thu, 10/15/2015 - 16:10

My man cave over the garage has a mini split installation. In case you are not familiar with mini splits, the DOE has an article.

TLDR?

These AC units are common all over the world, except in the USA, where central air dominates. They are more efficient and easier to install (no ductwork) but tend to be more expensive per BTU.

Through a connection my father-in-law has, I bought a ding-and-dent unit for an eighth of the normal price. The unit is a Mitsubishi MSZ-FE18NA with 18k BTUs of cooling/heating capacity. I figured 18k BTUs was ample capacity to stave off the intense Texas summer heat.

Turns out it may have been a bad idea to get a unit this powerful. Apparently these mini split systems never “turn off”; they always run at a percentage of their capacity, allegedly making them more efficient by eliminating the overhead of powering on and off. Don’t ask me how this ends up working out in the end! Once the set temperature is achieved, the indoor radiator stays at about the same temperature as the air, so hardly any condensation ever forms on it. Even worse, any condensation that does form quickly re-evaporates. So my cave potentially runs 20% higher humidity than outside, since the temperature is usually cooler. In Houston, Texas, this is bad news because the humidity is already high enough!

I researched the issue online. Tons of people are having the same problem. One solution is to downgrade to a smaller unit. Well, my unit is already installed; I would have to trade units with someone else, and a reinstallation would not be cheap. I had to think outside the box. A temporary fix seemed to be turning the AC off and on manually. When the AC unit detects a significant difference between the set temperature and the room temperature, it goes into a turbo mode where the radiator is significantly cooler and humidity is rapidly pulled out of the air.

The AC unit is controlled with an infrared remote. That got me thinking: assuming I could reverse engineer the digital codes the remote was sending to the AC, I could use a simple microcontroller to create my own remote that turns the AC on and off every half hour, or even make it temperature based. This way, the radiator would always be pulling humidity out of the air. A rough sketch of the idea is just below.
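A minimal sketch of that duty-cycling idea, where ir_send() and the code arrays are hypothetical stand-ins for the pieces built later in this post:

    /* Hypothetical top-level loop: replay the captured "on" and "off"
     * frames every half hour. */
    #include <unistd.h>

    extern void ir_send(const unsigned char *bits, int n);  /* transmitter */
    extern const unsigned char ac_on[], ac_off[];           /* captured codes */
    extern const int ac_on_len, ac_off_len;

    int main(void)
    {
        for (;;) {
            ir_send(ac_on, ac_on_len);    /* cool (and dehumidify) for 30 min */
            sleep(1800);
            ir_send(ac_off, ac_off_len);  /* rest for 30 min */
            sleep(1800);
        }
    }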

Solution:

So after doing a little bit of research and looking around the house, I found I had all the parts I needed. I had an infrared receiver circuit from a broken HDMI switch. For the infrared LEDs I had a pile of old Comcast remotes (they give you a new one every time you get a cable box), so I dismantled one and got a few LEDs. For a proof of concept, I hooked my infrared receiver circuit up to my oscilloscope, set a trigger, and then saw the waveform (square wave) of any infrared remote I fired in the vicinity on the screen of the scope. Great, this is possible so far!

The Project:

For the controller, I wanted to use an Atmel ATmega as in my other projects, but I had never tried a Raspberry Pi before and figured this was a good opportunity to learn some cool stuff. My first goal was to get the codes for on, off, temp up, temp down, etc. I needed to write a program that would listen to the waveform output by the receiver, record it, and interpret it. Hopefully the data would be like any other transmission and would have a baud/clock rate to make the values easy to determine.

I decided to write all my code in C after starting in Python and being unsatisfied with the performance (see the paragraphs at the very bottom for details). I wrote code to record the time at which the input changed. The recorded times were not completely uniform. To completely eliminate that jitter I would apparently need to write a kernel module. I did not want to open that can of worms for this project, so I figured I would just take several samplings to average out the error. I collected the data in Excel for analysis and worked out the time value that must be the baud rate. A sketch of the recorder is below.
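A minimal sketch of the edge-timing recorder, assuming a BCM2835-based Pi with the receiver output on GPIO 18, mapped through /dev/gpiomem. This version timestamps with clock_gettime() rather than raw cycle counts, and the register offsets are for the original Pi; they may differ on later models:

    #include <stdio.h>
    #include <stdint.h>
    #include <fcntl.h>
    #include <sys/mman.h>
    #include <time.h>

    #define GPIO_PIN  18
    #define GPLEV0    13            /* word offset of the pin-level register */
    #define MAX_EDGES 1024

    int main(void)
    {
        int fd = open("/dev/gpiomem", O_RDWR | O_SYNC);
        if (fd < 0) { perror("open"); return 1; }

        volatile uint32_t *gpio = mmap(NULL, 4096, PROT_READ | PROT_WRITE,
                                       MAP_SHARED, fd, 0);
        if (gpio == MAP_FAILED) { perror("mmap"); return 1; }

        long long times[MAX_EDGES];
        int level = (gpio[GPLEV0] >> GPIO_PIN) & 1;
        int n = 0;

        while (n < MAX_EDGES) {
            int now = (gpio[GPLEV0] >> GPIO_PIN) & 1;
            if (now != level) {                    /* edge detected */
                struct timespec ts;
                clock_gettime(CLOCK_MONOTONIC, &ts);
                times[n++] = ts.tv_sec * 1000000000LL + ts.tv_nsec;
                level = now;
            }
        }

        /* Print microsecond deltas between edges for spreadsheet analysis. */
        for (int i = 1; i < n; i++)
            printf("%lld\n", (times[i] - times[i - 1]) / 1000);

        return 0;
    }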

I still needed to write a program to decode the timings into digital values. I did this in C# because it was quick and easy. The C# program took the recorded times en masse via a text file and output a header file that I could compile into my C program for retransmission. The gist of the decoding is sketched below.
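Here is the gist of that decoding step, sketched in C rather than C# for consistency with the rest of the post. It assumes a NEC-style scheme where each bit is a fixed-width mark followed by a short space (0) or a long space (1); the thresholds here are illustrative, not my actual measured values:

    #include <stdio.h>

    /* Reads the microsecond deltas produced by the recorder on stdin and
     * emits a C array of bits. Deltas alternate mark, space, mark, ... */
    int main(void)
    {
        int us, is_space = 0;

        printf("static const unsigned char ir_bits[] = {\n");
        while (scanf("%d", &us) == 1) {
            if (is_space) {
                if (us > 5000)      printf("\n/* frame gap */\n");
                else if (us > 1000) printf("1, ");   /* long space  = 1 */
                else                printf("0, ");   /* short space = 0 */
            }
            is_space = !is_space;
        }
        printf("\n};\n");
        return 0;
    }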

When designing my output circuit I had to be careful: the pins on the Pi use 3.3V logic levels. In addition, the pins are current-limited to the low double digits of mA. I could not properly drive the LED directly from the Pi, so I needed a MOSFET with a low gate-threshold voltage.

[Image Here]

I wrote some code to output a 38 kHz carrier wave, which registers as an active low on IR receivers. This carrier modulates the values. Getting a PWM output on the Raspberry Pi was tricky because of the software layers; an AVR microcontroller would definitely have been easier to work with at this point. I was very careful to write code with as few delays as possible, and even then it was still tenuous because the OS could interrupt at any time and skew the output. An interrupt-based solution would have worked better, but that is very difficult to do on the Pi. Again, I would have been better off with the AVR at this point. A sketch of the carrier generation is below.
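A minimal sketch of bit-banging one "mark" of the frame, assuming the LED driver sits on GPIO 17, the same /dev/gpiomem mapping as the recorder above, and that the pin has already been configured as an output (the GPSET0/GPCLR0 word offsets are for the BCM2835). A 38 kHz carrier has a ~26 µs period, so the pin toggles roughly every 13 µs:

    #include <stdint.h>
    #include <time.h>

    #define GPIO_PIN 17
    #define GPSET0   7          /* word offset: drive pin high */
    #define GPCLR0   10         /* word offset: drive pin low  */

    static long long now_ns(void)
    {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return ts.tv_sec * 1000000000LL + ts.tv_nsec;
    }

    static void spin_until(long long t)
    {
        while (now_ns() < t) ;  /* busy-wait; usleep() is far too coarse */
    }

    /* Emit the 38 kHz carrier for duration_us microseconds. */
    void ir_mark(volatile uint32_t *gpio, int duration_us)
    {
        long long t = now_ns();
        long long end = t + duration_us * 1000LL;
        while (t < end) {
            gpio[GPSET0] = 1u << GPIO_PIN;  /* carrier high ~13 us */
            spin_until(t += 13000);
            gpio[GPCLR0] = 1u << GPIO_PIN;  /* carrier low  ~13 us */
            spin_until(t += 13000);
        }
    }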

Why I did not use Python:

I had never really used Python before, and it seems to be the language of choice for the Raspberry Pi. I was initially very pleased with how easy it was to get up and running: I could set up the GPIOs, timers, all kinds of stuff, and run it on the fly. However, I kept getting bogus results. For example, times that were too long, too short, or outright impossible. I guess Python is not all it's cracked up to be for the Pi. Since my program was running at the mercy of the interpreter, which in turn was running at the mercy of the OS, anything highly time sensitive was essentially impossible.

I have always wondered why so many modern programmers like to use Python so much. They try to use it for everything. Yes, it is easy to learn and run, but it has many limitations. The first is that many runtime environments are NOT backwards compatible. Additionally, if the developer does not check the runtime environment first (they never do :P), you get a highly cryptic error message that is almost impossible to trace, at least for someone who does not code in it all the time. The second major shortcoming is that it runs very inefficiently: the (frozen) executables are huge, and single-threaded performance is piss poor. This would all be fine and good if Python were confined to the applications that suit it best (scripting). But it seems like people try to solve all kinds of problems with it these days, spending all this effort getting it to work right, when another language would have saved them that time and given them a better product. The third issue is that people who write in it are pretty much ignorant of the hardware/OS they are on. That can be a strength up to a point, but there are lots of gotchas.