Community opinion on describing targeted objects using AI? · AstroBin Platform open discussions community forum · AmyWarble

This topic contains a poll.
How do you feel about this?
yes - this is fine, go right on ahead!
yes - but with limitations
no - this is never okay
I'd prefer you write your own descriptions but don't feel strongly about it
Turix 3.61 · 1 like
I'd draw a comparison with the widespread use of AI tools during the processing of images, both here on AstroBin and in the wider community. The difference here, at least, is transparency; there is, I believe, a general expectation that you declare any major equipment / tooling you've used during capture and processing - something that isn't necessarily true elsewhere.

That being said, personally, I see AI tooling (including LLMs) as just that: another tool - just another step in the workflow, so to speak. Sure, you can use an LLM to wholesale generate a description for you, but you then accept the risk that the output might, for example, contain factual errors, not "sound like you", or otherwise flow poorly - in the same way that, if you misuse AI deconvolution or noise reduction, you can absolutely tell they've been used, and you can potentially ruin the resultant image.

I've actually been down this road myself. I started this year trying to use LLMs (Gemini in my case) to help with producing descriptions. At first I tried what is being suggested in this thread, wholesale generating large blocks of text from prompts - an approach I used exactly once (M33). I never really liked the result; it just... didn't feel like me. Since then I've changed tactics: I now use them for two real purposes, either producing talking points for me to write about or proofreading the final description for spelling/grammar/style. I've found this second approach far more... comfortable.

I think my summary would be: knock yourself out. If you're concerned about what other people think, maybe consider the transparency point I talked about above. The description of the image, at least in my view, isn't the creative article we're really interested in here on AstroBin - its purpose is to simply describe the image.

Transparency - I used Gemini to check spelling, grammar and style on this post 
Gondola 8.11 · 5 likes
This thread is a good example of why the use of AI to do simple things is another step in making people, in general, more reliant on technology and less able to do things without it. There are a lot of logical holes in this argument, but hear me out.

As we become more reliant on technology we become less capable as human beings. I don't remember phone numbers anymore because I use a cell phone. I'm not very good at remembering complex directions anymore because I use navigation. When I moved to Tulsa a few years ago I was shocked at how hard it was for me to learn my way around without it. There are many examples like this where the use of technology results in loss of personal capability. Our brains simply don't have to work as hard to do everyday things.

The post about making art for D&D and this entire thread make my point: I think they show that the overuse of AI erodes learning and makes us less capable without it. If someone can easily use AI to make an avatar, there's less impetus to learn how to do it yourself. That's an opportunity for learning lost. If it's a struggle to write a few lines about an image you created, that's a warning that another basic skill is being degraded. Our grade school teachers would be shocked. Technology is great - we couldn't create the images we do without it - but we have to be honest about what is doing the heavy lifting here, because it's not us.

I don't pretend to know what the answer is, and there very well might not be one because, at heart, we are lazy primates. I am convinced, however, that the use of technology, and especially AI, makes the human species, in a way, less capable. Worse than that, it makes us lazy and less able to survive. Everything is great until it isn't.
macnenia 5.87 · 3 likes
Tony Gondola:
This thread is a good example of why the use of AI to do simple things is another step in making people, in general, more reliant on technology and less able to do things without it. ...

I do understand the sentiment, but this is a Luddite view of the world. We are already using tools where, basically, we don't really know how they work; we just adjust a few parameters until we get what we want. Do you know the mathematics and direct computations going on during denoising or deconvolution? AI is just another tool that, in theory, can make life easier. If you ask me whether I would like to go back to having to remember phone numbers or to drive with a map book on my lap to navigate the streets, the answer is no. Even if that means my recall for numbers or my recall of driving routes is worse, that is a trade-off I am willing to make.
Does a motor car make us lazy because, instead of that 10-minute drive, we could walk for an hour? Yes, perhaps. But those 50 minutes could be spent doing something more useful, like spending time with family.
All technology is a double-edged sword and has its positives and negatives. AI certainly can create issues, but as was said, "resistance is futile".
CCDnOES 8.34 · 2 likes
I am not opposed to AI content itself, but I have to say that, in general, it is a bad idea for the people who make use of it.

AI use for finding facts, even if accurate, deprives people of the skills needed to discover and properly verify their own facts. In a world where many of our problems are due to sometimes highly questionable "facts", depriving people of those skills cannot be a good thing.
Alan_Brunelle · 1 like
Niall MacNeill:
Tony Gondola:
This thread is a good example of why the use of AI to do simple things is another step in making people, in general, more reliant on technology and less able to do things without it. ...

I do understand the sentiment, but this is a Luddite view of the world. ...

I've specifically "liked" the last few posts to highlight how this discussion has me swinging wildly back and forth: as with other posts above, I read one and say, "yes, that makes good sense," and then I read the next post as a counter-point to the previous one and say, "yes, that makes good sense." Yes, to some extent technology has always created some of the benefits and problems that seem to be paramount in this discussion. Maybe the issue here is that AI is impacting society at an accelerating pace that we humans cannot handle? Forget AI - a simple example I have noticed lately, at least in my new community, is the increase in the number of electric bikes. The benefit? As my wife and I were biking (pedaling) along the local river trail, we saw more people roughly our age (mid to late 60s) "wheeling" along at a comfortable pace without really pedaling, or without pedaling at all. The good? Well, people who might not bike anymore are actually getting out and enjoying the fresh air and the sights. The bad? In my neighborhood, so many kids are out and about on their bikes. It used to be that kids all had bikes and spent hours after school tooling around neighborhoods, up and down hills, etc. Now they just push a button and go. Unlike the adults, kids don't even do the fake pedaling. Childhood obesity much?!

I am reluctant to even bring this up, but people here will do what people here do. It may not cause a stir over the next few years. And as Niall states, there are elements of AI that we now find routine and acceptable. I hope someone completely disagrees with what I am about to say and shows me why it is not going to happen, but consider the following: in the not very distant future, someone new will join AstroBin and start posting images. These will be quite nice and Top Picks will come to this person. Eventually this person will start winning IOTDs, etc., etc. And then it will be discovered that this person is not even a person - a Turing Test staged by someone to prove a point, with an ugly result. Will people here then sit up and take notice?
EricGagne 3.82
It may have been said before; tbh, I didn't read the whole topic.

I think AI descriptions are fine as long as there’s a note that it is AI generated so people know any of the information can be totally wrong.

It could be a very good tool for people who don't speak English and want to describe their shooting and processing workflows. With the correct prompts and descriptions, AI can create a much more pleasing text than Google Translate would ever do.

I don't know every AI bot out there, but I have used Copilot recently and I like that it includes links to references in every answer. It makes it easy to check whether the answer comes from legit sources or is all made up.
CCDnOES 8.34 · 2 likes
Alan Brunelle:
In the not very distant future, someone new will join AstroBin and start posting images.  These will be quite nice and Top Picks will come to this person.  Eventually this person will start winning IOTDs, etc., etc.  And then it will be discovered that this person is not even a person.


Sadly, I agree that this is inevitable. 
Habib_Sekha 1.51 · 1 like
Let's see what ChatGPT thinks about this:

Yes, it's absolutely okay to use AI to write a description about an astronomy picture — and in many cases, it's even encouraged.

Here's why it can be a good idea:

✅ Benefits of Using AI for Astronomy Picture Descriptions:
- Clarity: AI can explain complex astronomical concepts in simpler terms for a wider audience.
- Speed: It helps you generate content quickly, which is especially useful for educational or outreach purposes.
- Creativity: AI can help you craft poetic or inspiring descriptions that enhance the viewer's appreciation of the image.
- Accessibility: Well-written descriptions can make visuals more accessible to people with visual impairments.

⚠️ A Few Best Practices:
- Fact-check the output: Make sure the scientific details are accurate. AI might misidentify celestial objects or phenomena.
- Customize for your audience: Tailor the tone and depth of detail to match whether you're addressing students, amateur astronomers, or the general public.
- Credit original sources: If the image comes from a space agency or telescope archive (like NASA, ESA, Hubble, JWST, etc.), it's best to credit them.

If you'd like, you can upload an astronomy picture here, and I can help generate a description tailored to your needs!

Source: ChatGPT
Gondola 8.11 · 1 like
Niall MacNeill:
Tony Gondola:
This thread is a good example of why the use of AI to do simple things is another step in making people, in general, more reliant on technology and less able to do things without it. ...

I do understand the sentiment, but this is a Luddite view of the world. ...

I don't disagree, but Luddite might be a strong term.
Astromonkey 7.83 · 2 likes
AmyWarble:
Aaron Lisco:
This adds to the discussion I started on separating remote image capture, where you do not maintain nor own the equipment and only pay for data; others do the heavy lifting.

You simply process the purchased top of the line data.

At what point do you stop calling yourself an astrophotographer and start calling yourself a DATA processor?

If you start using AI to add descriptions, then are you really invested in the image you capture, or just pumping out pictures and letting AI tell us, and you, about it?

Uh, what?  No I certainly did not purchase "top of the line data".  I'm an astrophotographer because I take pictures of the night sky and process those pictures into something pretty.  I would stop calling myself an astrophotographer when I stop doing that.  If I don't write descriptions at all, I would still be an astrophotographer.

What nonsense.

I was not saying you're purchasing data... I was just pointing out that this is the beginning of the slippery slope of AI and of not acquiring your own data, and of the need to separate the images into those groups. They are all fine, but the distinction needs to be drawn and labeled as such. I was NOT accusing you of anything - sorry to cause offense. I agree that if you take the data and process it, then YES, you are an astrophotographer, even if AI writes the description. If you purchase data or do not maintain a telescope, then you are, in my mind, a DATA collector and processor ONLY!
AccidentalAstronomers 18.64 · 2 likes
Here on Astrobin, I'm not generally interested in information about the object unless it's something unique or little known. I can't imagine there's a whole lot to say about M31 that we don't all know. I'm much more interested in how people practice this craft, why they do it that way, and what interests them about the target. Even better--tell me a true story about it that makes me laugh or makes me care in some way.

I'm not sure there's an AI out there that can put that together--and if there is, well, that's friggin' creepy. If I want to know what's in Google or ChatGPT results, I can get that myself if I'm so inclined. When it comes to posting to other platforms like FB or Insta, it makes some sense to include some pedantry about the target, because that's a completely different audience. But I don't think questionable AI prose, plagiarized equally from astronomers and flat earthers, has much utility here.
Alan_Brunelle · 1 like
Timothy Martin:
Here on Astrobin, I'm not generally interested in information about the object unless it's something unique or little known. ...

I agree with what you say here, Tim. I personally tend to write too much in my descriptions, but that comes from the relationships I've developed with other AstroBinners who seem to like to discuss some of the things I go on about. And I like what they go on about. This stuff can and should be entertaining. These are what I consider true friends. But really, I write the descriptions as a sort of diary to myself, so generally what I write about is not the trivial facts that can be had on wiki, etc. A lot of the processing stuff I write is really for myself. But here, I usually restrict what I say to my preferences and the issues I have with some of the current processes and norms. If I am able to do this in some form for many more years, it will be interesting to go back and read how I have changed.

I also want to provide information to my family, if they should ever visit the site, now or after I am long gone. For that, I tend to stay at a higher level of what these objects are and what they mean in the scheme of current astrophysics. My two kids are likely not interested in astrophotography per se, but both are scientists, as is my wife. As you can tell, while I like to make images that are nice, I do not put enough effort into the aesthetics of my images to ever really consider myself someone others will learn from regarding that part of the skill.

In other words, I do not think AI will ever be me, as you suggest in your statement. And if for some reason I should post such an AI description, those who enjoy (or suffer through) what I typically post will immediately know that there is something wrong with me and be concerned for my health!
warble_master 12.34 · Topic starter · 1 like
Tony Gondola:
This thread is a good example of why the use of AI to do simple things is another step in making people, in general, more reliant on technology and less able to do things without it. ...

You have a strong point when it comes to kids.  Kids should not use LLMs, or drink beer, or engage in any of the plethora of activities adults can engage in.  Doing so can impact their neurological/cognitive development because as you said, they aren't learning skills, possibly even including how to reason.  Their development can be degraded because they've off-loaded thinking to the LLM.  I hold firm to this belief: LLMs should be regulated so you must be 18 or older to use them at all.  We need regulations around the ethical use of AI; people should not be allowed to sell a book written by ChatGPT while passing it off as their own work.

I worry about kids today. Even before AI came on the scene, kids entering college in the US frequently struggled with reading. Now that we have AI...

I've had the same experience as you regarding cell phones and maps. I have three phone numbers memorized in case of emergency, and that's it (four if you count 911). However, I think a similar argument could be made against the specialization of skills and labor in general. I don't know how to grow food. I've never needed to farm. I also don't know the first thing about building a house or barn, unless it's made of Legos. These are skills I never needed to develop because I can rely on others to do it for me, even as I specialize in programming and music. My inability to farm or build structures that stand up for more than ten minutes is a skill loss, sure, but it's made up for with other skills. I view AI in the same vein. It can free up some kinds of labor so I can focus on and specialize in others. The loss of a skill doesn't necessarily mean a net loss or that survival is impacted.

As an aside, I worry about the next Carrington Event.  We are so reliant on modern technology that a CE would be cataclysmic.  But that's neither here nor there.

The skill loss argument doesn't apply to my gaming group (AFAIK).  We're all in our 40s (one guy in his 30s).  I have never been able to draw anything beyond basic stick figures and fluffy clouds.  I literally have zero talent when it comes to drawing.  It wasn't for lack of trying.  Yet, my time and energy were freed to learn other skills with much less effort and much more gain.  No skills are getting degraded, just moved around.  When I was able to create art for the first time in my life, I was elated.  It was like receiving a stool so I could see over the fence.  Using a stool to get higher doesn't cause harm.  I now have a capability, provided by a tool, that makes my world bigger, not smaller, and my time is free to explore other skills.

I am basically arguing, without any scientific basis to back this up, that using a tool to achieve some goal frees the mind up to learn other skills.  At least, for adults.  Kids shouldn't use LLMs at all.

I would be interested to see if anyone here can come up with a (decent) ballad on the fly while in the middle of a gaming session.  None of my gaming group are songwriters or have any training as stand-up comics (who learn to think quickly on their feet).  We're playing a game that already consumes a lot of time.  And no one in my gaming group is denying that AI is doing the heavy lifting when it comes to ballads and avatars. 

There are many people who claim AI-produced content as their own, but that is unethical behavior and isn't an intrinsic property of the AI itself.  Some human-made tools don't have ethical uses.  Until we're defending the planet against alien spaceships, I'd argue that nuclear weapons are always unethical.  I don't believe LLMs fall into that category.  We can make ethical use of them.  It is very easy and tempting to use them unethically.
Aaron Lisco:
I was not saying you're purchasing data... I was just pointing out that this is the beginning of the slippery slope of AI and of not acquiring your own data, and of the need to separate the images into those groups. ...

Yeah, sorry I jumped the gun on that.  I see how I misread your post.  We're all good.
Astromonkey 7.83
Bill McLaughlin:
Alan Brunelle:
In the not very distant future, someone new will join AstroBin and start posting images. ...

Sadly, I agree that this is inevitable.

Look at Pinterest: they say 70% of ALL photos on that site are AI! Even Instagram pics and videos are made with AI to get likes.
AccidentalAstronomers 18.64 · 1 like
Aaron Lisco:
made with AI


Thanks to Russ Croman, roughly 100% of pics posted here are "made with AI." But I get the distinction. The photos you're talking about were made entirely by AI. Still, it's an increasingly gray area here.
Gondola 8.11 · 1 like
Aaron Lisco:
Bill McLaughlin:
Alan Brunelle:
In the not very distant future, someone new will join AstroBin and start posting images. ...

Sadly, I agree that this is inevitable.

Look at Pinterest: they say 70% of ALL photos on that site are AI! Even Instagram pics and videos are made with AI to get likes.

I've also noticed that more and more thumbnails on YouTube are AI and more than a few YouTube channels and posts are 100 percent AI. This has been a change that's happened very quickly, like over the last 6 months. I can tell the difference but I'm not sure everyone can.