0x41 (ASCII for 'A'), but other times it's a bit worse, like having a BCC where you XOR every byte of the message, but not the header bytes because those are always the same, and not the footer bytes because that's where the BCC goes! Like okay, sure, but with one device there wasn't even an ID, which means that every possible command and its corresponding
BCC is known ahead of time, so why not just include it in your docs?

```
arecord -D default -f S24_3LE -c 2 -r 48000 - | \
ffmpeg -c:a pcm_s24le -i - -af anlmdn=s=4 -c:a pcm_s32le -f wav - | \
ffmpeg -hwaccel cuda -hwaccel_output_format cuda -f x11grab -s 1920x1080 -r 60 -i :0.0 \
  -f v4l2 -input_format mjpeg -s 640x480 -c:v mjpeg_cuvid -i /dev/video0 \
  -i - -f alsa -i looprec \
  -filter_complex "[0:v][1:v]overlay=main_w-overlay_w-10:main_h-overlay_h-10[v];[2:a][3:a]amerge[a]" -map "[v]" -map "[a]" \
  -c:v h264_nvenc -profile:v high -tune ll -preset p7 -b:v 6M -bufsize 3M -g 240 -c:a aac -b:a 128k -ar 44100 \
  -f flv "rtmp://live.twitch.tv/app/live_xxxxxxxxx_xxxxxxxxxxxxxxxxxxxxxx"
```
```
-f x11grab -s 1920x1080 -r 60 -i :0.0
```

```
-f v4l2 -input_format mjpeg -s 640x480 -c:v mjpeg_cuvid -i /dev/video0
```

```
v4l2-ctl --list-devices
```
```
[relue:~]> v4l2-ctl --list-devices
HD Web Camera: HD Web Camera (usb-0000:07:00.4-1.2):
        /dev/video4
        /dev/video5
        /dev/media2

USB2.0 HD UVC WebCam: USB2.0 HD (usb-0000:08:00.0-1):
        /dev/video0
        /dev/video1
        /dev/video2
        /dev/video3
        /dev/media0
        /dev/media1
```
```
v4l2-ctl --device=/dev/video0 --list-formats-ext
```

```
ffmpeg -codecs | grep [v4l2/$format]
```

```
 DEV.LS h264  H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10 (decoders: h264 h264_v4l2m2m h264_cuvid ) (encoders: libx264 libx264rgb h264_nvenc h264_v4l2m2m h264_vaapi nvenc nvenc_h264 )
```
```
arecord -D default -f S24_3LE -c 2 -r 48000 - | \
ffmpeg -c:a pcm_s24le -i - -af anlmdn=s=4 -c:a pcm_s32le -f wav - | \
ffmpeg ... -i - ...
```
```
-f alsa -i looprec
```

```
-filter_complex "[0:v][1:v]overlay=main_w-overlay_w-10:main_h-overlay_h-10[v];[2:a][3:a]amerge[a]" \
-map "[v]" -map "[a]"
```

```
[1:a][2:a][3:a][4:a]...amerge[a]
```

```
-map "[v]" -map "[a]"
```

```
-c:v h264_nvenc -profile:v high -tune ll -preset p7 -b:v 6M -bufsize 3M -g 240 -c:a aac -b:a 128k -ar 44100
```

```
-f flv "rtmp://live.twitch.tv/app/live_xxxxxxxxx_xxxxxxxxxxxxxxxxxxxxxx"
```
```
-rtmp_live 1
```

flag, which was introduced in ffmpeg version 6.0 and allows for more codec compatibility, but still only single streams. For better support when using your own server, look into rtsp or webrtc.

Some more boilerplate for hls, rtsp and srt is below:

```
-f hls -hls_time 10 -hls_list_size 4 -hls_flags delete_segments -hls_segment_filename "segment_%v_%03d.ts" -hls_base_url "http://tv.reluekiss.com:8888" "https://user:password@tv.reluekiss.com:8888/mystream"
```

```
-f rtsp "rtsp://user:password@tv.reluekiss.com:1735/mystream/mystream"
```

```
-f mpegts "srt://tv.reluekiss.com:1735?streamid=mystream:mystream:user:password&pkt_size=1316"
```
r.Context() correctly, so I'm basically done except for all of the annoying parts. First there's password resets, but then I want to do fancy stuff like TOTP 2FA tokens, and after that oauth/passkeys, and then after that becoming an oauth provider. I think doing that is the final boss. Maybe API keys after that? But that's relatively harmless. I'm looking for ways to abstract the auth package to the most reasonable degree. I want it to be as easy as the Auth.js/auth0 people make it; more on that later.
I absolutely hate how many articles and tutorials there are online of software that does not pass the "am I, the author, willing to put this into a production service" test. With auth there should be no handwaving this material. Either it's golden or it's unusable, so tutorials that say "well, in the real world you wouldn't actually want to do this" make their content useless.

GET /profile/yogibear61
request, the Go server doesn't respond with a fully complete html document; what really happens is I render, say, just the <body> and htmx hoists the response into the body. This way there's not a full page reload with a potential flash of unstyled content.
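One way to get this behavior on the Go side is to branch on the HX-Request header that htmx sends with every request it makes. A rough sketch, with hypothetical handler and render-function names:

```go
package main

import (
	"fmt"
	"net/http"
)

// renderProfileBody stands in for a templ component; it returns just the <body>.
func renderProfileBody(user string) string {
	return fmt.Sprintf("<body><h1>%s</h1></body>", user)
}

// profileHandler branches on the HX-Request header that htmx adds to its requests:
// htmx gets just the fragment to swap in, while a direct page load gets the full document.
func profileHandler(w http.ResponseWriter, r *http.Request) {
	body := renderProfileBody("yogibear61")
	if r.Header.Get("HX-Request") == "true" {
		fmt.Fprint(w, body) // htmx hoists this into the existing page
		return
	}
	fmt.Fprintf(w, "<!DOCTYPE html><html><head><title>profile</title></head>%s</html>", body)
}

func main() {
	http.HandleFunc("/profile/yogibear61", profileHandler)
	// http.ListenAndServe(":8080", nil) // left out so the sketch compiles without serving
}
```

On the client side, htmx's hx-target/hx-swap attributes control where the returned fragment lands.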
The header, footer, etc. continue to exist; they aren't destroyed, it's just that the <body> tag gets replaced. This I think would help make the page feel faster and whatnot, but I'm not sure how legal that is in my mind yet.

<script>alert("pwned")</script>
, and it actually runs, yeah, you have some security implications. Just sanitize your inputs; on our blog site we do this with the comments and it's literally one function.

uuid.UUID
type. But sqlite3 does not have such a type; it only has text. So, my db package would wrap both of these calls, the application would give whatever is most convenient (in this case, a uuid.New()), and db would disperse this information to what's correct for each database. The problem with this is the synchronization and return types. Basically, the only way to make it all work is if I made everything in postgres whatever sqlite3 can handle, and from which database should I return? What if I was writing to them both at the same time? This issue is solvable, but as I was doing it I realized it required an amount of effort that wasn't worth it. If you're curious, you can go here to figure out how I was trying to crack this egg. It is not pretty, but you're going to be looking in src/db.

As for src/auth, it's not as simple as taking on auth from on high. For example, I have it
so the SignUp/SignIn struct has a method of signature RenderErrs() []string
so when I’m rendering the html, I pass that
struct into the templ templating, and within the template if there is anything within any of the errors, it will show them
in little boxes on the page. If SignUp.UsernameErr exists, it will make the border color of the username box red. How could
you make that into a package if you didn’t own or understand the contents of the struct? The jwt boilerplate I think is
production-ready, so that will probably be the first part in my auth series, but that one file alone probably isn’t worth
installing. Just copy it! If you look at it and you think it’s good, copy it now! I like it, and if you don’t then
please cut an issue because I want to get a second opinion. One cool gimmick I'm going to pull with my auth series is that when I feel comfortable enough to share it, I'm going to pay real cash-money to a web security professional to do a code tour and pentest on my auth system. I want to get real insight on what I personally overlooked, and what common mistakes are. This step is important because the number one reason I hear online for why someone should never, ever roll their own auth is that there are "so many" footguns you can find yourself in; you should really leave auth to people who dedicate their entire company to security. But, to me, if you own your auth, and it Just Works, then you have it forever. You can microservice the heck out of it, and as long as you don't have severe skill issues, it will still Just Work. Auth is a concept, so if you have it done correctly then it will Just Work forever. That is, unless passwords really go out of fashion and you have to move to passkeys, but I'll have that covered, too!

\[[0, x_0) \to [(1-t)x_0,x_0)\]
\[(x_0, 1] \to (x_0, x_0 + t(1- x_0)]\]
\[ H_t = \begin{cases} r_{1-2t}, & 0 \leq t \leq 1/2 \\ s_{2t-1}, & 1/2 \leq t \leq 1 \end{cases} \]
\[s_t : \{r\}_i \times [0, 1 - r]_i \mapsto \{r\}_i \times [0, t (1 - r)]_i, \forall t \in I, i \in \mathbb{Z}\]
Table 7(a) HEVC 1920x1080p BD-PSNR: Bitrate Savings relative to

| | SVT-AV1 | X265 | VP9 |
| :-----: | :-----: | :-----: | :-----: |
| VVC | 49.8% | 67% | 75.8% |
| SVT-AV1 | - | 32.6% | 51% |
| X265 | - | - | 27.6% |

Table 7(b) HEVC 1920x1080p BD-VMAF: Bitrate Savings relative to

| | SVT-AV1 | X265 | VP9 |
| :-----: | :-----: | :-----: | :-----: |
| VVC | 54.2% | 59.8% | 67.8% |
| SVT-AV1 | - | 13.73% | 26.77% |
| X265 | - | - | 17.84% |
```
ffmpeg -i input.mkv -c:v libvpx-vp9 -pix_fmt yuv420p10le -pass 1 -quality good -threads 4 -profile:v 2 -lag-in-frames 25 -crf 25 -b:v 0 -g 240 -cpu-used 4 -auto-alt-ref 1 -arnr-maxframes 7 -arnr-strength 4 -aq-mode 0 -tile-rows 0 -tile-columns 1 -enable-tpl 1 -row-mt 1 -f null -
ffmpeg -i input.mkv -c:v libvpx-vp9 -pix_fmt yuv420p10le -pass 2 -quality good -threads 4 -profile:v 2 -lag-in-frames 25 -crf 25 -b:v 0 -g 240 -cpu-used 4 -auto-alt-ref 1 -arnr-maxframes 7 -arnr-strength 4 -aq-mode 0 -tile-rows 0 -tile-columns 1 -enable-tpl 1 -row-mt 1 output.mkv
```
```
ffmpeg -i input.mkv -c:a copy -c:v libsvtav1 -pix_fmt yuv420p10le -pass 1 -preset 5 -crf 26 -g 240 -svtav1-params tune=0:tile-columns=1:tile-rows=0 -f null -
ffmpeg -i input.mkv -c:a copy -c:v libsvtav1 -pix_fmt yuv420p10le -pass 2 -preset 5 -crf 26 -g 240 -svtav1-params tune=0:tile-columns=1:tile-rows=0 output.mkv
```
House M.D., Season 3 Episode 1:

    Dr. Wilson: The fifth level of happiness involves creation, changing lives.

    Dr. House: The sixth level is heroin, the seventh level is you going away.
/gh/ link at the top of the page. Natalie wants to make it so you can do ">>number" to reference a previous comment, and I'll get around to that at some point, as well as paginating the comments so that if there are more than X, there will be a Page 2 with another X comments.
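The page-N slicing itself is simple enough; here's a rough Go sketch of what that comment pagination could look like (paginate and its signature are hypothetical, not the site's code):

```go
package main

import "fmt"

// paginate returns the slice of comments for a 1-based page, pageSize per page,
// plus the total number of pages needed to show everything.
func paginate(comments []string, page, pageSize int) ([]string, int) {
	totalPages := (len(comments) + pageSize - 1) / pageSize // ceiling division
	start := (page - 1) * pageSize
	if page < 1 || start >= len(comments) {
		return nil, totalPages // out-of-range page renders empty
	}
	end := start + pageSize
	if end > len(comments) {
		end = len(comments)
	}
	return comments[start:end], totalPages
}

func main() {
	cs := []string{"c1", "c2", "c3", "c4", "c5"}
	page2, pages := paginate(cs, 2, 2)
	fmt.Println(page2, pages) // [c3 c4] 3
}
```

A "Page 2" link then just re-renders the same template with page+1.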
Natalie and I have really fun ideas for the website. I want to do things like have an email list, have a comments section, make an email server for @reluekiss.com, and other stuff. The problem is Astro (the JS framework that makes this website now) is not Real Programming. I enjoy Real Programming. When I run into a problem and solve it, I can rely on my previous Real Programming skills to help me, and afterwards I add that problem to my stack of Real Programming skills. When I have an issue with Astro, like in the import.meta.glob
situation, there is not a Real Programming skill that will tell me what the right answer is. All of this to say, I’m rewriting the website in Go here soon, and you can bet your bottom dollar I’ll be making posts about the process. Specifically, hosting SSG vs SSR (the normal way) is something I’m super curious about.
import.meta.glob, I get frustrated at myself for not being able to express what I want. Writing is hard! Like, really hard. And I think that a video where I do a code tour is a much simpler way to teach. Other posts, like my Abstraction Essay, don't click in my head as a video. I think I wrote it really well, and it is very much an essay! Video and writing are not mutually exclusive.

```
#01
pcm.!default {
    pcm "hw:0,0"
}

ctl.!default "hw:0"
```
aplay -l. Also, if you ever wish to change the order of them, you can do so by editing the /etc/modprobe.d/alsa-base.conf file; explaining what occurs beyond indexes is outside of the scope of this. But for an example, it could look something like mine:

```
options snd-hda-intel index=0
options sof-intel-dspcfg dsp_driver=3 index=1
options snd_usb_audio index=2
options snd-aloop index=4 enable=1 pcm_substreams=4 id=Loopback
```
```
#02
pcm.!default {
    type asym
    playback.pcm "dmixed"
    capture.pcm "dsnooped"
}

pcm.dmixed {
    type dmix
    ipc_key 1024
    ipc_key_add_uid 0
    slave {
        pcm "hw:1,0"
        period_time 0
        period_size 1024
        buffer_size 4096
        channels 2
    }
    bindings {
        0 0
        1 1
    }
}

pcm.dsnooped {
    type dsnoop
    ipc_key 1025
    slave {
        pcm "hw:1,7"
        period_time 0
        period_size 1024
        buffer_size 4096
        channels 2
    }
    bindings {
        0 0
        1 1
    }
}
```
```
#03
pcm.dsnooped {
    type dsnoop
    ipc_key 1025
    slave {
        pcm "hw:1,7"
        rate 44100 #48000
        channels 2
    }
    bindings {
        0 0
        1 1
    }
}
```
```
sudo xbps-install snd-aloop
```
```
#04
pcm.dmixerloop {
    type dmix
    ipc_key 2048
    ipc_perm 0666 # allow all users read write permissions
    slave.pcm "hw:Loopback,0,0"
    slave {
        period_time 0
        period_size 1024
        buffer_size 4096
        channels 2 # must match bindings
    }
    bindings {
        0 0
        1 1
    }
}

pcm.out {
    type plug
    route_policy "duplicate"
    slave.pcm {
        type multi
        slaves {
            a { channels 2 pcm "dmixed" }
            b { channels 2 pcm "dmixerloop" }
        }
        bindings {
            0 { slave a channel 0 }
            1 { slave a channel 1 }
            2 { slave b channel 0 }
            3 { slave b channel 1 }
        }
    }
    ttable [
        [ 1 0 1 0 ]
        [ 0 1 0 1 ]
    ]
}

pcm.looprec {
    type dsnoop
    ipc_key 2049
    ipc_key_add_uid 0
    slave {
        pcm "hw:Loopback,1,0"
        period_time 0
        period_size 1024
        buffer_size 4096
        channels 2
    }
    bindings {
        0 0
        1 1
    }
}
```
```
Loopback,0,0 <-> Loopback,1,0
Loopback,0,1 <-> Loopback,1,1
Loopback,0,2 <-> Loopback,1,2
```
```
pcm.softmixer {
    type softvol
    slave.pcm "out"
    control.name "PCM"
    control.card 1
}
```
```
device "XX:XX:XX:XX:XX:XX"
profile "a2dp"
service "org.bluealsa"
ctl.device bluealsa
```
```
pcm.blueout {
    type plug
    slave {
        pcm {
            type dmix
            ipc_key 1026
            slave {
                pcm {
                    type hw
                    card "Loopback"
                    device 0
                    subdevice 1
                }
            }
        }
    }
}
```
```
$ mkdir -p /etc/sv/alsaloop
$ vim /etc/sv/alsaloop/run

#!/bin/sh

exec >/var/log/alsaloop.log 2>&1

exec alsaloop -C looprec -P bluealsa:DEV=XX:XX:XX:XX:XX:XX,PROFILE=a2dp -c2 -fs16_le -t 20000

$ ln -s /etc/sv/alsaloop /var/service
```
```
Sashok: Здравствуйте, это канал об аниме?
Да.
Sashok: Как мне пропатчить KDE2 под FreeBSD?
```

```
Sashok: Hello, is this the '#anime' channel?
Yes.
Sashok: How does one patch KDE2 under FreeBSD?
```
Astro.glob function they provide. If you use this, you have to use it within a .astro file, and if you're within a .astro file, then you can't really import a specific JS function. Luckily, Astro.glob is just a wrapper around import.meta.glob. This means we can use our own glob function and put it in a .ts file.

import.meta.glob to do what you want it to. If you're underwhelmed by getCollection and the other default tools that Astro provides for content management, this is for you. For those uninitiated, say you have .mdx files inside of blogs/, and your webpages inside of pages/. You might use either of these functions to not just parse these files; Vite will do some really cool stuff with actually understanding the content of the file. Check the docs for more, because our usecase is very specific: read .astro/.mdx files, grab the metadata from them, do some extra processing, and make a new route for them on the website. All we have to do for this is create the file inside of blogs/ and make sure the thing builds. If it builds, it's good to go!

```ts
export type BlogDetails = {
  title: string;
  date: string;
  author: string;
  overrideHref?: string;
  overrideLayout?: boolean;
  description?: string;
  image?: string | string[];
  tags?: string[];
  hidden?: boolean;
  aria?: { [x: string]: ImageAccessibility };
};
```
```ts
export type ImageAccessibility = {
  alt: string; // a description of the image
  role?: astroHTML.JSX.AriaRole; // list of image roles: https://developer.mozilla.org/en-US/docs/Web/Accessibility/ARIA/Roles#roles_defined_on_mdn
  ariaDescribedby?: string; // if you describe the image in an HTML element, give it an id like id="carpark-description". that way the screen reader can say "this div describes the image"
  loading?: astroHTML.JSX.ImgHTMLAttributes["loading"]; // set to "eager" if the image is essential to the post, "lazy" if it is not. default is lazy.
};

export type Image = {
  url: string;
  size: string;
  ext: string;
  filename: string;
  fullname: string;
  accessibility: ImageAccessibility;
};

export type Post = BlogDetails & {
  id: number;
  globbedImgs: Image[];
  relativeUrl: string;
  absoluteUrl: string;
  dateObj: Date;
  Component: AstroComponentFactory;
};
```
Post objects.

```astro
---
export const details: BlogDetails = {
  title: "Images-in-the-terminal",
  date: "2024-Jan-01",
  author: "natalie",
  image: ["LainLaugh.gif", "ncmpcpp.png"],
  aria: {
    "LainLaugh.gif": { alt: "an animated girl laughing" },
    "ncmpcpp.png": { alt: "a terminal window with a music playing program open, complete with song picker and audio visualizer" },
  },
};
---
```
```yaml
---
title: Looking Forward to the Future
date: 2024-Jan-20
author: nathan
image: excitementometer.jpg
aria:
  excitementometer.jpg:
    alt: a gauge of excitement, towards high
---
```
types.ts, we can do something like this:

```ts
export type BlogAstro = AstroInstance & {
  details: BlogDetails;
};

export type BlogMdx = MDXInstance<BlogDetails>;
```
```ts
export function extractMetadata(i: BlogAstro | BlogMdx): {
  details: BlogDetails;
  component: AstroComponentFactory;
  dateObj: Date;
} {
  if ("details" in i) {
    return {
      details: i.details,
      component: i.default,
      dateObj: parseDateString(i.details.date).dateObj,
    };
  }
  if ("frontmatter" in i) {
    return {
      details: i.frontmatter,
      component: i.Content,
      dateObj: parseDateString(i.frontmatter.date).dateObj,
    };
  }
  // this throw won't fire unless you ignore typescript. I just like errors (golang arc)
  throw new Error(`Input: ${i} is not a valid BlogAstro or BlogMdx`);
}
```
i.default or i.Content represent the html of the file. parseDateString() is just a function that either parses the date or throws. We have it in another function just for the throwing angle (did I mention I like to find bugs at compile time instead of runtime?).

globBlogs()
```ts
export async function globBlogs(
  limit: number | undefined,
  author: PossibleAuthors | undefined,
  hideHidden: boolean | undefined
): Promise<RGlobBlogs[]> {
  let combined: Post[] = [];
  const interim: ReturnType<typeof extractMetadata>[] = [];
  const blogs = import.meta.glob<BlogAstro>("/src/blog/**/*.astro");
  //^? Record<string, () => Promise<BlogAstro>>

  for (const post in blogs) {
    const f = await blogs[post]();
    const g = extractMetadata(f);
    interim.push(g);
  }

  const mdxs = import.meta.glob<BlogMdx>("/src/blog/**/*.mdx");

  for (const post in mdxs) { // Note: "in" not "of"
    const f = await mdxs[post]();
    const g = extractMetadata(f);
    interim.push(g);
  }

  interim.sort((a, b) => {
    return a.dateObj.getTime() - b.dateObj.getTime();
  });
```
.glob are type assertions, so make sure your source of truth is… truthy. You could zod your way out of this, but to me, if you have good types and throw errors when you need to, zod is irrelevant in this situation. The argument to .glob needs to be a literal string - I don't know how they can tell when it's a variable, but they do. It's okay though, because we get individual files by passing the file that we want as the index, and calling it as a function.

```ts
  let count = 100001;
  for (const p of interim) {
    const id = count;
    count++;
    let href;
    if (p.details.overrideHref) {
      href = p.details.overrideHref;
    } else {
      href = `p/${id}`;
    }
```
```ts
    let imgs: Image[] = [];
    if (typeof p.details.image === "string") {
      imgs = await globImages(
        [p.details.image],
        p.dateObj.getFullYear().toString(),
        p.details.aria
      );
    }

    if (Array.isArray(p.details.image)) {
      imgs = await globImages(
        p.details.image,
        p.dateObj.getFullYear().toString(),
        p.details.aria
      );
    }
```
```ts
    combined.push({
      ...p.details,
      id: id,
      Component: p.component,
      dateObj: p.dateObj,
      relativeUrl: href,
      absoluteUrl: `/${p.details.author}/${href}`,
      globbedImgs: imgs,
    });
  }
```
```ts
  combined = combined.sort((a, b) => b.dateObj.getTime() - a.dateObj.getTime());

  combined.map((c) => pushBlogToDb(c));

  if (author) {
    combined = combined.filter((c) => parseAuthorName(c.author) === author);
  }
  if (limit) {
    combined = combined.slice(0, limit);
  }
  if (hideHidden) {
    combined = combined.filter((c) => c.hidden !== true);
  }

  return combined.map((c) => {
    return {
      params: { post: c.relativeUrl },
      props: { c },
    };
  });
}
```
getStaticPaths() function expects a params key and an object where the key is what you put as the file name (for us it's [...post].astro) and the value is the url relative to that file. As for the props, it means we can do cool stuff with them instead of everything being a basic string. Let's look at globImages().

import.meta.glob() is not just for this; it does a whole lot, and I'm really interested in using it to convert our site from Astro to Go when I feel comfortable doing so (the html templating just isn't as good yet, and that's important to me for Natalie).

```ts
export async function globImages(
  imgs: string[],
  year: string,
  aria: BlogDetails["aria"]
): Promise<Image[]> {
  const globber = import.meta.glob("/public/**/*.{jpg,gif,png,jpeg,bmp,webp}", {
    as: "url",
  });

  let images: Image[] = [];
```
{ as: "url" }, so we don't really care about the content of the file, but its location. This function is only ever accessible through Vite, so it already knows where your root folder is and what the url would be to get there.

```ts
  let images: Image[] = [];

  for (const img of imgs) {
    const i = `/public/images/covers/${year}/${img}`;
    const url = await globber[i]();

    const fsPath = `.${i}`;
    const size = fs.statSync(fsPath).size;
    const ext = path.extname(fsPath);
    const file = path.basename(fsPath, path.extname(fsPath));
    const urlNoPublic = url.slice("/public".length);
```
/ is the root of the project, and not the root directory of the machine. Whenever that "just works", I'm really happy, because I can have absolute paths and not be scared. But, understandably, the fs and path packages disagree, so we put a little dot in front of it, because the astro build (probably npm run build for you) command is always run from the root directory.

/a/, so it can't be a security thing.

```ts
    if (!url || !urlNoPublic) {
      throw new Error(`ERROR: ${url} undefined from ${imgs}`);
    }

    if (!aria || !aria[img]) {
      console.log(`\n=====\nNo aria for the image ${img}. Consider adding one.\n=====\n`);
    }
    // else {
    //   console.log(`aria for ${img}:\n ${JSON.stringify(aria[img])}`);
    // }

    const defaultAria = { [img]: { alt: "" } };
    const accessibility = { ...defaultAria, ...aria }[img];

    images.push({
      size: formatBytes(size),
      ext: ext,
      url: url,
      filename: file,
      fullname: `${file}${ext}`,
      accessibility: accessibility,
    });
  }
  return images;
}
```
import.meta.glob(), but I hope it helps. Here's an example [...post].astro if you're really stuck.

```astro
---
export const getStaticPaths: GetStaticPaths = async () => {
  const g = await globBlogs(undefined, "nathan", false);
  // console.log(g)
  return g;
};

const props = Astro.props as RGlobBlogs["props"];
---

<NathanLayout details={props.c}>
  <props.c.Component />
</NathanLayout>
```
params object, and we filtered for just nathan; there's no limit for how many routes we want to make, and we're not hiding hidden posts, because even hidden posts should have a url to them. The <props.c.Component/> ends up getting put in the <slot/> of the layout from @layouts/nathan/Root.astro.

\[U \underset{\iota}{\overset{g_{t_0}}{\rightleftarrows}} V \underset{\widetilde{\iota}}{\overset{h_1}{\rightleftarrows}} \{x\}\]
consts.ts, and it maps a filename to an object that has the accessibility stuff as the value. So, in glob.ts, when we grab the images using import.meta.glob, it tacks on this accessibility object to the image. Also, comments coming soon. RSS a little later. Astro is not very interested in allowing md/mdx/astro files to all go into an RSS feed at once.

import.meta.glob
to generate the sites here, because it’s 100% a hack and not the way either Astro or Vite
is documented. I also want to show off my
no-magic-stack at some point. In that
stack, I want to make a few sample web apps just to learn. It’s been so much fun figuring
out different things in Go, and I feel like I’m learning software fundamentals by using
it. After I get auth + todo app working, I’m going to make a twitch chat clone (websockets
seem really cool!), and I have a couple more ideas for after that: something to do with
wasm (I would need ideas on something sufficiently difficult for wasm to shine), and a
service status page. Part of this is to learn, part is to show off a cool stack for others
to replicate, and part is because I really want a twitch chat clone that uses
htmx/go/sqlc/tailwind to exist. I also want to flesh out things like testing, stress
testing (as in requests/sec), and token-based api auth. Maybe you, Natalie, can make some
tutorial or documentation regarding the LaTeX stuff you had to go through recently (unless it actually was documented - I just know you were having issues).

\[(f_2f_1) \circ (g_1g_2) = f_2 \circ (f_1g_1) \circ g_2 \simeq f_2 \circ 1_Y \circ g_2 = f_2g_2 \simeq 1_Z\]
\[(g_1g_2) \circ (f_2f_1) = g_1 \circ (g_2f_2) \circ f_1 \simeq g_1 \circ 1_Z \circ f_1 = g_1f_1 \simeq 1_X\]
.go files it generates, it's not using its own library code for these queries. It uses the builtin database/sql for the connections and requests to the database url. It uses github.com/lib/pq for the Postgres part of the database and github.com/google/uuid for UUID types. I'm sure there are others, but this gives a sense of how extremely typical and unobtrusive this generated code is.

a-h/templ
and htmx. templ is probably the more difficult one to justify. You can think of templ as React functional components, except they aren't very capable of running Go code within them. It's really just html templating, but the primitives of actually using it are so much better than anything else. It's a function that returns html, and it accepts (and gives LSP support for) structs/maps/slices/strings/whatever. You can't put that complicated of logic inside of the templ components, but this is fine. Some might go as far as to say that logic doesn't belong at all in components that return html.

Okaaaaaaaaaay, it's finally time to write some math runes to the web at long last. I'll be working through Hatcher's Algebraic Topology, as I ran through it a couple years ago and, as many people do, I went through it too quickly. So that I have an actual foundation in the subject, I'm running through the problems; each section has a multitude of problems, and I'll try to work through every other one just so I'm not working on part 0 next year.
So the first question:
1. Construct an explicit deformation retraction of the torus with one point deleted onto a graph consisting of two circles intersecting in a point, namely, longitude and meridian circles of the torus.
So the first question of the intro section; this should be pretty simple and it is (kind of). As most first questions are, we are going to need a couple definitions, foremost what a deformation retract is:

A deformation retract of a space X onto a subspace A is a family of maps f_t : X → X, t ∈ I, such that f_0 = 𝟙 (the identity map), f_1(X) = A, and f_t|_A = 𝟙 for all t. The family f_t should be continuous in the sense that the associated map X × I → X, (x, t) ↦ f_t(x), is continuous.
The first trick is to use the following construction, where the torus S¹ × S¹ can be obtained by identifying opposite edges of a square, which can be seen here.

Let I = [−1, 1] be an interval in ℝ. Then I² is a square in ℝ². Without loss of generality, we choose the origin to be the deleted point. So now we need to construct a map f on I² ∖ {0}, which takes a bit of fiddling, but you can find it to be:

\[ f_t(x,y) = (1-t)(x,y) + t\,\frac{(x,y)}{\max\{|x|,|y|\}} \]

Working from the requirements: f_0(x,y) = (x,y), and f_1(x,y) = (x,y)/max{|x|,|y|}, where the latter is an element of ∂I². The restriction f_t(x,y)|_{∂I²} = (x,y) holds because max{|x|,|y|} = 1 on ∂I². I hope continuity should be easy to see. □
Alright one down, ... a ton to go, see you all tomorrow.
Nvm, here's another one, cause it's basically the same.

2. Construct an explicit deformation retraction of ℝⁿ ∖ {0} onto Sⁿ⁻¹.

If you remember some linear algebra, for a vector in ℝⁿ ∖ {0}, the normalised vector lies inside of Sⁿ⁻¹. So we just need the map to be a normalisation process which is continuous, much like the previous question but over n variables, i.e.

\[ f_t(x) = (1-t)x + t\,\frac{x}{\|x\|} \]

Just to check our bases: the function f_t is continuous for each t ∈ I, f_0(x) = x, f_1(x) = x∕‖x‖, and f_t(x)|_{Sⁿ⁻¹} = x due to the fact that ‖x‖ = 1 for all x ∈ Sⁿ⁻¹. □
And if you thought I was doing these in order you’re out of luck bucko, I might make an index page that links to each question at some point, but that doesn’t exist yet so :3.
20. Show that the subspace X ⊂ ℝ3 formed by a Klein bottle intersecting itself in a circle as shown in Figure 1 is homotopy equivalent to S1 ∨ S1 ∨ S2.
Let X be the figure shown, intersecting itself at a circle C. The main
float-left the whole time!). But I made a few really solid improvements to the site, and to Natalie's ease of use in making posts, as well as customizing it in the future. I think the types I've set up for everything are stable, the code blocks are stable, and the images are stable. I'm really happy with how this has turned out so far.

toLocaleString()
, it says it does it in local time - but what the heck does that mean? We're on serverless, and we statically generate the html. I know there's a way to include the timezone as well in the creation of the date object, so again, I might do it, I might not. For future reference, for the time being, I'm in EST.
```sh
#!/bin/bash
export PS1=''

UB_PID=10
UB_SOCKET=""

pkill -x "ueberzugpp" || true

UB_PID_FILE="/tmp/.$(uuidgen)"
ueberzugpp layer --no-stdin --silent --use-escape-codes --pid-file "$UB_PID_FILE"
UB_PID=$(cat "$UB_PID_FILE")
export UB_SOCKET="/tmp/ueberzugpp-$UB_PID.socket"

CACHE=/tmp/albumcover

while true; do
    if [ -e /tmp/albumflag ]; then
        rm /tmp/albumflag
        #SONG=$(cmus-remote -Q | sed -n '/^file/s/^file \(.*\)$/\1/p')
        SONG=~/Music/"$(mpc --format %file% current)"
        ffmpegthumbnailer -i "$SONG" -o "$CACHE" -s 500 -q 10
        ueberzugpp cmd -s "$UB_SOCKET" -a add -i PREVIEW -x 0 -y 0 --max-width 200 --max-height 200 -f "$CACHE"
        clear
        #exiftool -Lyrics "$SONG" | sed -e 's/\.\.\+/\n/g' -e 's/\./\.\n/g'
    fi
    sleep 1
done
```