I'm working on a little Arduino-like project that involves a motion detector with customizable sensitivity. The ambition here is that I could get a PIR sensor wired up that would give an analog-esque output depending on how much motion it detects.

I know standard PIR sensor modules have a potentiometer for sensitivity and another for delay. My idea is to remove both of those potentiometers and instead have both the delay and the sensitivity handled by software that I'm working on. In other words, the PIR would output an analog voltage based on the motion it detects, and that value would be read in by a device, compared against a programmable threshold, and a decision would be reached as to whether enough 'motion' had occurred to set a flag high. When this occurs, the program itself would also handle the delay time (for example, how long to leave a light on). Removing all potentiometers and making customization purely software-dependent is the goal here.

I'm looking at a few schematics and pictures of PIR sensor modules, and I believe the PIR element itself is putting out an analog signal at some point, which is then checked against a comparator. With that in mind, I think it's possible to cut out the extraneous stuff on these modules and simply have the PIR's analog output sent out constantly. If I can cut out the potentiometers, it'll save a little on cost, and it'll also let a user change the settings through the control program instead of fiddling with hardware components.

How would I go about crafting a circuit that gives a constant analog output from a PIR sensor?