General > General Technical Chat
Are ChatGPT threads taboo? This isn't bad.
paulca:
https://gitlab.com/paulcam/home_heating/-/blob/master/mqtthub/future_processor.py
--- Code: ---from threading import Thread
import copy
import logging

from libs.master_clock import MasterClock
from libs.logging_setup import setup_logging

setup_logging()
module_logger = logging.getLogger(__name__)
module_logger.setLevel(logging.DEBUG)


class FutureRequestProcessor:
    def __init__(self, request_queue):
        self.request_queue = request_queue
        self.handlers = []
        self.thread = Thread(target=self.run)
        self.thread.start()
        self.logger = module_logger.getChild(self.__class__.__name__)
        self.logger.addHandler(logging.NullHandler())

    def run(self):
        while True:
            request = self.request_queue.get()
            self.logger.debug("Got request: {0!s}".format(request))
            for handler in self.handlers:
                outbound_request = handler.handle(request)
                if outbound_request:
                    self.logger.debug("Sending outbound request: {0!s}".format(outbound_request))

    def publish_request(self, request):
        request_json = request.to_json()
        topic = self.request_topic_root + "/" + request.data["zone"]
        self.cache.client.publish_retained(topic, request_json)


class RampHandler:
    def __init__(self, zone, ramp_gradient):
        self.zone = zone
        self.ramp_gradient = ramp_gradient

    def handle(self, request):
        if request.data["zone"] != self.zone:
            return
        if "metadata" not in request.data:
            return  # can't compute ramp without target
        metadata = request.data["metadata"]
        if "targetTemp" not in metadata:
            return  # can't compute ramp without target
        if "currentTemp" not in metadata:
            return  # can't compute ramp without current temp

        future = request.data["timestamp"]
        current_temp = metadata["currentTemp"]
        target_temp = metadata["targetTemp"]
        now = MasterClock.instance.get_current_time()

        request_gradient = self.compute_ramp(future, now, current_temp, target_temp)
        if request_gradient >= self.ramp_gradient:
            # should start to ramp up now.
            request_target = self.compute_ramp_point_temp(target_temp, now, future)
            metadata["targetTemp"] = request_target
            metadata["future"] = copy.deepcopy(request.data)
            return request
        return None

    def compute_ramp_point_temp(self, target, now, future):
        future_y = target
        future_x = future
        now_x = now
        delta_x = future_x - now_x
        delta_y = (delta_x / 3600) * self.ramp_gradient
        now_y = future_y - delta_y
        # TODO Does this assuming heating?
        # Oh fuck off. I live in Ireland.
        return now_y

    def compute_ramp(self, future, now, current, target):
        delta = target - current
        if delta < 0:
            return  # negative shouldn't happen??? unless it's cooling
        time_delta = future - now
        request_gradient = float(delta) / float(time_delta)
        request_gradient_degrees_per_hour = float(delta) / float((time_delta / 3600))
        return request_gradient_degrees_per_hour
--- End code ---
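To make the ramp maths concrete, here's a rough worked example of what compute_ramp and compute_ramp_point_temp come out to. The numbers below are made up purely for illustration, and it assumes the timestamps are in seconds:
--- Code: ---# Illustrative numbers only, assuming timestamps are in seconds.
ramp_gradient = 2.0      # handler configured for 2 °C per hour
current_temp = 16.0      # °C measured now
target_temp = 21.0       # °C wanted at the future timestamp
time_to_future = 7200    # seconds until the future setpoint (2 hours)

# compute_ramp: gradient needed to hit the target, in °C per hour
required_gradient = (target_temp - current_temp) / (time_to_future / 3600)
print(required_gradient)   # 2.5 °C/h, >= 2.0, so the handler starts ramping now

# compute_ramp_point_temp: where the ramp line sits right now; this value
# replaces targetTemp in the request metadata
ramp_target_now = target_temp - (time_to_future / 3600) * ramp_gradient
print(ramp_target_now)     # 21 - 2 * 2.0 = 17.0 °C
--- End code ---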
Can you tell me what some code does?
--- Quote ---Certainly! Please provide the code you would like me to explain.
--- End quote ---
<Paste example above.>
--- Quote ---This is a Python code that defines two classes: FutureRequestProcessor and RampHandler. The FutureRequestProcessor class is responsible for processing future requests by creating a thread and continually getting requests from a queue. It has a run() method that is called when the thread starts and inside the loop, it gets a request from the queue and passes it to each of its handlers to be handled. If any handler returns an outbound request, it logs the request and sends it.
The RampHandler class is a handler that computes and updates the target temperature for a zone to achieve a ramp up/down effect, given the current and target temperatures, and the timestamp for the future temperature change. If the difference between the current and target temperatures is greater than or equal to the ramp gradient set for the handler, it computes the new target temperature based on the current temperature, target temperature, ramp gradient, and time difference. The handle() method returns the updated request with the new target temperature or None if no update is needed.
There are also some imports and a setup function call for logging. However, since I don't have access to the rest of the code, I cannot determine the exact use case of these classes or how they are being used in the larger program.
--- End quote ---
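For context, this is roughly how the two classes hang together. FakeRequest below is a made-up stand-in carrying just the fields RampHandler looks at; the real request class, the queue contents and the MasterClock wiring live elsewhere in the repo, so this is only a sketch:
--- Code: ---# Rough usage sketch only. FakeRequest is a made-up stand-in; the real
# request class and MasterClock setup live elsewhere in the repo.
import time
from queue import Queue


class FakeRequest:
    def __init__(self, zone, timestamp, current_temp, target_temp):
        self.data = {
            "zone": zone,
            "timestamp": timestamp,
            "metadata": {"currentTemp": current_temp, "targetTemp": target_temp},
        }


request_queue = Queue()
processor = FutureRequestProcessor(request_queue)   # starts its worker thread
processor.handlers.append(RampHandler(zone="living_room", ramp_gradient=2.0))

# Ask for 21 °C in the living room two hours from now. run() pulls it off the
# queue, and RampHandler.handle() rewrites targetTemp once the gradient needed
# to reach the target hits 2 °C per hour.
request_queue.put(FakeRequest("living_room", time.time() + 7200, 16.0, 21.0))
--- End code ---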
Sounds epic. Sounds about right. But is it correct?
Nominal Animal:
--- Quote from: paulca on February 24, 2023, 08:08:40 pm ---Sounds epic. Sounds about right. But is it correct?
--- End quote ---
Ask ChatGPT.
SiliconWizard:
--- Quote from: Nominal Animal on February 25, 2023, 05:11:53 pm ---
--- Quote from: paulca on February 24, 2023, 08:08:40 pm ---Sounds epic. Sounds about right. But is it correct?
--- End quote ---
Ask ChatGPT.
--- End quote ---
;D
Psi:
I think a thread simply about asking ChatGPT something has probably had its time now.
Unless it's something actually novel and new.
But I think it's fine to post a ChatGPT answer to something in an existing thread. The modern-day equivalent of "let me google that for you". hehe
xrunner:
How about how to build ChatGPT? I think this is pretty interesting. The video is nearly 2 hours long, but I found I understood much more about it by watching the first 20 minutes or so of the coding process. From the video's description:
--- Quote ---We build a Generatively Pretrained Transformer (GPT), following the paper "Attention is All You Need" and OpenAI's GPT-2 / GPT-3.
--- End quote ---
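If you just want the gist without sitting through the whole two hours: the core operation the video builds up, step by step, is scaled dot-product self-attention. Here's a minimal NumPy sketch of just that piece (sizes and weights are arbitrary; real GPTs add multiple heads, per-layer learned projections, residuals, layer norm and so on):
--- Code: ---# Minimal scaled dot-product self-attention with a causal mask.
# Shapes and values are arbitrary, purely for illustration.
import numpy as np

def self_attention(x, Wq, Wk, Wv):
    q = x @ Wq                                   # queries (T, d)
    k = x @ Wk                                   # keys    (T, d)
    v = x @ Wv                                   # values  (T, d)
    scores = q @ k.T / np.sqrt(k.shape[-1])      # (T, T) scaled similarities
    # causal mask: each token may only attend to itself and earlier tokens
    mask = np.triu(np.ones_like(scores, dtype=bool), k=1)
    scores = np.where(mask, -1e9, scores)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ v                           # weighted mix of values

rng = np.random.default_rng(0)
T, d = 4, 8                                      # 4 tokens, 8-dim embeddings
x = rng.normal(size=(T, d))
out = self_attention(x, *(rng.normal(size=(d, d)) for _ in range(3)))
print(out.shape)                                 # (4, 8)
--- End code ---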